bluetree blog


Getting Data from Epic Caboodle back into Chronicles

You’ve figured out how to get data from Epic and other external systems into Caboodle. Now you want to use that data to drive reports in Epic, so you have your next data challenge: getting data from Caboodle back into Epic.

Assuming there’s not a standard Epic interface available to move your data, you have a couple of choices to get data into Chronicles – Datalink or Epic web services. Let’s compare them to help you decide which you might use.

How Does Datalink Work?

Loading data into Epic via Datalink can be a straightforward process, particularly if Epic has already created the building blocks necessary for the type of data you are working with. A Datalink operation has three parts: Action, Action Criteria, and Execution.

The Datalink Action defines where the data will be loaded into Chronicles. Epic has released several commonly used Datalink Actions for data such as Patient Lists, Care Team, Claims, and SmartData Elements. While it is possible to create your own Datalink Actions, it requires some Caché development to do it correctly, which is not an exercise for the faint of heart.

The second part of the Datalink operation, the Datalink Action Criteria, defines the specifics of the data you are loading. For example, if you are loading SmartData Elements, the criteria would include the SmartData Identifier of the SmartData Element and several other parameters, such as context, encounter type (if necessary), and the type of patient ID you are sending (MRN, Epic ID, or some other external ID). You also provide a query or stored procedure (which can be parameterized) for the Datalink operation to call at runtime. These records will need to be created for your specific use case.



The final piece of Epic development required for a Datalink operation is the Datalink Execution. This is created in the Clarity Console, whereas the other pieces are both created in Hyperspace (or released by Epic). Here you select all the Datalink Action Criteria you want to run, set up any dependencies, point the Clarity Console to the appropriate database, and run the execution to load the data.

What Are Web Services?

Web services are queries or data sent via standard HTTP protocols. There are two main standards, SOAP and REST, and Epic has released many services using the REST protocol. Web services are generally grouped by whether they are used to read data (GET services) or to write data (SET services).

Calling a web service always involves writing some code, typically (but not necessarily) in C#. You will need code to authenticate, then code to call the appropriate web service, search the data, send the appropriate data to Chronicles, and finally handle any errors that Chronicles may return.
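To make those steps concrete, here is a minimal Python sketch of what a SET-style call might look like. Everything specific here is an assumption: the base URL, the `SetSmartDataValues` service name, and the payload field names are all hypothetical stand-ins, since the real service names and schemas come from your Epic documentation. The shape, though, is the same: authenticate, send, and handle any errors in the response.

```python
import json
import urllib.request

BASE_URL = "https://interconnect.example.org/api"  # hypothetical endpoint


def build_smartdata_payload(patient_id, id_type, elements):
    """Shape the body for a hypothetical SET service: one patient, many elements."""
    return {
        "PatientID": patient_id,
        "PatientIDType": id_type,  # e.g. "MRN" or an external ID type
        "SmartDataElements": [
            {"SmartDataID": sde_id, "Value": value}
            for sde_id, value in elements.items()
        ],
    }


def parse_errors(response_body):
    """Pull any error messages out of a (hypothetical) response document."""
    return [e["Message"] for e in response_body.get("Errors", [])]


def set_smartdata(token, patient_id, id_type, elements):
    """Authenticate via bearer token, POST the payload, return any errors."""
    req = urllib.request.Request(
        f"{BASE_URL}/SetSmartDataValues",  # hypothetical service name
        data=json.dumps(build_smartdata_payload(patient_id, id_type, elements)).encode(),
        headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return parse_errors(json.load(resp))
```

In practice you would also handle transport-level failures (timeouts, authentication rejections) separately from the application-level errors Chronicles returns in the response body.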

Pros and Cons of Datalink

The biggest advantage of Datalink is integration. Because everything is built in Epic, you don’t need to build out authentication, and the development requirements are minimal. Someone in your organization will need to create the necessary SQL code but, assuming a Datalink Action Criteria exists for the data you want to load, no other development is required – only creating records in Hyperspace and the Clarity Console.

The lack of flexibility in Datalink is its biggest weakness. If you need to load a type of data that isn’t supported by the released Datalink Actions, you will need to do Caché development to create a satisfactory Datalink Action. Another minor quibble with Datalink is that all of the errors that occur during an execution are stored in the Clarity Console. They are not easily made available to the Caboodle tables if you want to store error information alongside the row that caused the error. It can be done, but it is significantly simpler with web services, where errors are returned with each call.
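To illustrate that last point, here is a hedged Python sketch of the per-row error capture that web services make easy: each call returns its own errors, so you can record them right next to the source row. The `send_row` callable stands in for whatever SET service you are using, and the `LoadError` column name is made up.

```python
def load_with_row_errors(rows, send_row):
    """Call a SET service per row; record each row's error (if any) with the row.

    `rows` are dicts destined for Chronicles; `send_row` is your web service
    call and returns a list of error messages (empty on success).
    """
    for row in rows:
        try:
            errors = send_row(row)
            row["LoadError"] = "; ".join(errors) if errors else None
        except Exception as exc:  # transport-level failure, not a Chronicles error
            row["LoadError"] = f"call failed: {exc}"
    return rows
```

The annotated rows can then be written straight back to a staging table – something that takes noticeably more plumbing when the errors only live in the Clarity Console.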

Pros and Cons of Web Services

The biggest advantage of Epic web services over Datalink is flexibility. Using web services, you can GET many elements out of Chronicles in real time and SET a large subset of those. Using web services also allows for flexibility in initial data gathering, and potentially for more manipulation of data before sending it. It also allows for immediate feedback for errors, which can potentially be handled within the code itself. For example, take a case where you have multiple IDs for a given patient (due to patient merge), and you don’t know which one is accurate. Using a web services solution, you could loop through the IDs before sending the data using GET services to see which one is accurate, then send the appropriate one using the SET service.
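That merge-handling loop can be sketched as follows. The `get_patient` callable stands in for a GET service call, and the idea of a `Status` field marking a record "Active" is an assumption for illustration; the real field names depend on the service you call.

```python
def resolve_patient_id(candidate_ids, get_patient):
    """Try each candidate ID against a GET service; return the first one that
    resolves to an active (non-merged) record, or None if none do."""
    for patient_id in candidate_ids:
        record = get_patient(patient_id)  # GET service call (injected)
        if record is not None and record.get("Status") == "Active":
            return patient_id
    return None
```

Once the accurate ID is found, you send the data with the SET service using that ID alone.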



The downside of using web services is creating and maintaining code. Even if your organization has development expertise in-house, that expert might not be part of the analytics team. As more teams get involved, your analytics projects might become more complex. Development using web services will also generally take longer, as the error handling and authentication pieces that you get for free using an Epic integrated solution require development if you are writing code outside of Epic. Additionally, you will have to maintain the code – if anything changes upstream, it will require changes to code that will take time to complete and test.

Performance Considerations

You need to consider how your data load method will affect performance. First is the overhead of each execution, which is determined by the number of calls to Epic that each methodology makes. This will be highly dependent on the type of data you are loading. We’ll use SmartData Elements as a case study to compare:

  • For Datalink, you are required to have one Action Criteria for each SmartData Element, and each call will send all patients that the SmartData Element should be added to.
  • For web services, you make one web service call for each patient, and send all SmartData Elements to be added to that patient.
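The difference in overhead is easy to quantify. A small sketch, using illustrative numbers:

```python
def call_counts(num_patients, num_smartdata_elements):
    """Calls per execution: Datalink makes one call per SmartData Element
    (each call carries every patient); web services make one call per patient
    (each call carries every element)."""
    return {
        "datalink": num_smartdata_elements,
        "web_services": num_patients,
    }


counts = call_counts(num_patients=10_000, num_smartdata_elements=3)
# Datalink: 3 calls; web services: 10,000 calls
```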

If you have a large number of patients and a small number of SmartData Elements, which will generally be the case, Datalink wins this one easily.

The second performance consideration is the writing of data to Caché. Taking our SmartData Element example, because of the way data are stored in Caché, writing all of a patient’s SmartData Elements at the same time is more efficient than the alternative of writing one element to one patient, then the same element to the next patient, and so on. This effect is generally minor, but web services win this round.
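If you do use web services, the friendlier write pattern falls out of simple batching: group your Caboodle rows by patient before sending, so each call writes all of a patient’s elements at once. A sketch, with illustrative column names:

```python
from collections import defaultdict


def batch_by_patient(rows):
    """Group (patient_id, sde_id, value) rows so that each patient's
    SmartData Elements can be sent in a single web service call."""
    batches = defaultdict(dict)
    for patient_id, sde_id, value in rows:
        batches[patient_id][sde_id] = value
    return dict(batches)
```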

So, Which One Should I Use?

Consider the following questions to help you decide whether to use web services or Datalink to move data from Caboodle to Chronicles:

  • If we use web services, do we have the expertise to create and maintain it?
  • Does Epic have a released Action for the data we are looking to load? If not, do we have the Caché expertise to create and maintain it?
  • How much data are we loading, and what are the performance implications?
  • Do I need immediate, complex error handling, or will the error handling provided by Epic in the Clarity Console suffice?

While each situation is different, the answers to these questions will point you in the right direction. In general, Datalink is the safer choice, but it provides less flexibility if you want to do something that doesn’t fit snugly within the typical Epic use cases.

Have a few questions for Jason? Fill out the form below!
