We provide care design, informatics, research and analysis, education program design, and program management that bridges the boundaries between the functional, clinical, and technical worlds.
Case Study: Enabling Rapid Response to Health Emergencies through Analytics
More than 200 million gallons of crude oil flowed into the Gulf of Mexico over 87 days, making it the largest oil spill in U.S. history. Overall, 16,000 miles of coastline were affected, including the coasts of Texas, Louisiana, Mississippi, Alabama, and Florida. Even though the gushing well was capped in July 2010, oil is still washing up on some shores, which may cause long-term health effects for people living in the area.
A group of agencies, including EPA, NOAA, FEMA, the Coast Guard, and others, teamed up within hours of the event to monitor the air, water, sediment, and waste generated by the spill and cleanup operations. The purpose of this collaboration was to assess potential pollutants for immediate and long-term health risks and to provide this data to the citizens of the Gulf Coast area as quickly and accurately as possible.
In support of the BP Oil Spill cleanup efforts, EPA was responsible for monitoring air and water quality throughout the Gulf region to detect anomalies generated by the spill as well as by the chemicals used for cleanup. To do this, EPA needed an automated way to collect, consolidate, analyze, and publish the results of the ongoing air and water sampling. These results were of critical importance to numerous groups, and accurate dissemination of this data was vital to the health and safety of the general public. EPA also needed to make certain that the information was easy to understand and relevant to each audience. An already challenging task was made more difficult by the fact that this critical information had to be delivered to citizens as quickly as possible, even while the oil spill was still ongoing.
Directly after the BP Oil Spill occurred, EPA and its Federal agency partners turned to Salient CRGT to support their response to the event. Knowing that time was short, Salient CRGT worked around the clock, in coordination with numerous agencies, to create a web-based application for analyzing and publishing air, water, and sediment quality data.
With this application, large quantities of raw data were transformed into a near-real-time data mart designed to support an accurate inspection workflow, comprehensive analysis by scientists, timely reporting, and information clarity. Using its proprietary data analytics methodology, Salient CRGT created a simple way to give the general public immediate, location-specific access to vital air, water, and sediment quality reports. During the first months after the spill, this website received over 30,000 hits per day.
Together, Salient CRGT and the government worked night and day, during the spill and beyond, to ensure data was reported to citizens with accuracy, speed, and full transparency.
As with every data analytics project Salient CRGT delivers, this project drew on our ever-evolving analytics methodology and contributed new ideas and best practices to it for future efforts. As a partner to the Federal government, Salient CRGT continues to leverage these best practices in data collection, analysis, and visualization to help customers throughout the government advance their missions. This partnership is focused on improving citizen health and safety, data transparency, and overall quality through innovative analytics solutions and systems.