Data Junction Corporation, a developer of integration solutions, announced that MedAssets HSCA, the third-largest group purchasing organization in the country, is using Data Junction data integration technology as part of its Strategic Information service, which enables healthcare institutions to maximize their financial performance. By exchanging and integrating data among its clients' many different systems and its own application, MedAssets HSCA audits purchasing records and delivers significant financial improvements to its customers' supply chain and contract services performance.

MedAssets HSCA's Strategic Information project is a system for matching hospital and healthcare provider purchasing information against contractual information to determine whether the best price was paid for specific items and services. MedAssets HSCA built a Web service in .NET C# to integrate information from various sources, in a variety of formats, with its own backend database. The integration challenges are numerous: data is not necessarily represented the same way in the MedAssets HSCA systems and their clients' systems; data fields may be listed in different orders; and codes identifying specific items are likely represented differently in different systems. Massaging this data before integrating it into the system is therefore vital to the project.
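The core matching problem can be illustrated with a small sketch. Everything here is hypothetical, not MedAssets HSCA's actual code: the function names, record layouts, and the normalization rule are invented to show the idea that item codes must be reduced to a canonical form before purchase prices can be compared against contract prices.

```python
# Hypothetical sketch of a price-audit matching step. Item codes
# arrive in differing formats ("ab-1001" vs "AB 1001"), so each is
# normalized to a canonical form before the contract-price lookup.

def normalize_code(code):
    """Strip punctuation and whitespace, then upper-case the code."""
    return "".join(ch for ch in code if ch.isalnum()).upper()

def audit(purchases, contract_prices):
    """Return purchases that paid more than the contracted price."""
    overpaid = []
    for rec in purchases:
        key = normalize_code(rec["item_code"])
        contracted = contract_prices.get(key)
        if contracted is not None and rec["unit_price"] > contracted:
            overpaid.append((rec["item_code"], rec["unit_price"], contracted))
    return overpaid

purchases = [
    {"item_code": "ab-1001", "unit_price": 12.50},
    {"item_code": "AB 1002", "unit_price": 8.00},
]
contract_prices = {"AB1001": 11.75, "AB1002": 8.00}
print(audit(purchases, contract_prices))  # [('ab-1001', 12.5, 11.75)]
```

The first purchase is flagged because its normalized code matches a contract at a lower price; the second matches its contracted price exactly and passes.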

MedAssets HSCA receives weekly and monthly updates from its clients' commercial and homegrown enterprise resource planning (ERP) systems. Files are delivered in a variety of formats with varying levels of data integrity. If the data is clean, it is simply mapped directly to MedAssets HSCA's database using Data Junction's intuitive graphical user interface (GUI). However, some files contain legacy mainframe data or report data that needs to be manipulated. In these cases, Content Extractor is used to create the data associations and mine the required information from the reports; the result is then imported into Strategic Information's database using Data Junction's mapping tools.
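To give a rough sense of the report-mining step, the sketch below parses a fixed-width report line into named, typed fields. The column layout and field names are invented for illustration; Content Extractor establishes these data associations graphically rather than in code, and real mainframe report formats vary widely.

```python
# Hypothetical fixed-width report layout -- invented columns for
# illustration only; real mainframe report formats vary widely.
LAYOUT = {
    "item_code":  (0, 8),    # (start, end) column offsets
    "qty":        (8, 14),
    "unit_price": (14, 24),
}

def parse_report_line(line):
    """Slice one report line into named fields and convert types."""
    rec = {name: line[a:b].strip() for name, (a, b) in LAYOUT.items()}
    rec["qty"] = int(rec["qty"])
    rec["unit_price"] = float(rec["unit_price"])
    return rec

# Build a sample line matching the layout above.
line = "AB1001".ljust(8) + "12".rjust(6) + "11.75".rjust(10)
print(parse_report_line(line))
# {'item_code': 'AB1001', 'qty': 12, 'unit_price': 11.75}
```

Once records are pulled out of the report this way, they can be fed into the same mapping path as the clean files.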

In either case, with or without the use of Content Extractor, the import batch process is automated to run in near real time. Once the process is initially built, the system handles the import routine automatically without the user ever having to invoke Data Junction technology manually. Data Junction's Integration Engine runs on a database server, where the service polls for jobs. When a job arrives on the message queue, the service initiates the correct transformation process and imports the information into the backend database. Once the job completes, a pop-up window automatically appears to notify the user that it is finished.
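The poll-and-dispatch loop described above can be sketched roughly as follows. The queue, job format, and transformation functions are assumptions for illustration, not Data Junction's actual API: in the real system the service watches a message queue and hands each job to the Integration Engine, whereas here plain Python functions stand in for the two transformation paths.

```python
import queue

# Hypothetical stand-ins for the two transformation paths: clean
# files map straight to the database; report files are extracted
# first, then mapped.
def load_clean_file(job):
    return f"mapped {job['file']} directly to the database"

def extract_then_load(job):
    return f"extracted report data from {job['file']}, then mapped it"

TRANSFORMS = {
    "clean": load_clean_file,
    "report": extract_then_load,
}

def service_loop(jobs):
    """Drain the job queue, dispatching each job by its type."""
    results = []
    while True:
        try:
            job = jobs.get_nowait()
        except queue.Empty:
            break  # no more jobs; a real service would keep polling
        results.append(TRANSFORMS[job["type"]](job))
    return results

jobs = queue.Queue()
jobs.put({"type": "clean", "file": "erp_weekly.csv"})
jobs.put({"type": "report", "file": "mainframe_dump.txt"})
print(service_loop(jobs))
```

A production service would loop indefinitely and block or sleep between polls rather than exiting when the queue is empty; the dictionary dispatch mirrors the idea of selecting "the correct transformation process" per job.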
