Tuesday, April 8, 2014 - 12:45pm - 1:45pm
Special Topics

Become a Big Data Quality Hero

Many believe that regression testing an application with minimal data is sufficient. However, the testing methodology becomes far more complex with big data applications. Testing can now be done within the data fabrication process as well as in the data delivery process. Today, comprehensive testing is often mandated by regulatory agencies—and, more importantly, by customers. Finding issues before deployment is critical: it protects your company’s reputation and, in some cases, prevents litigation. Jason Rauen presents an overview of the architecture, processes, techniques, and lessons learned at an original big data company. Detecting defects up front is vital. Learn how to test thousands, millions, and in some cases billions—yes, billions—of records directly, rendering sampling procedures obsolete. See how you can save your organization time and money—and achieve better data test coverage than ever before.

Jason Rauen, LexisNexis

Jason Rauen is a senior quality test analyst at Georgia-based LexisNexis Risk Solutions. With more than fifteen years of experience, Jason has led the big data testing team since its inception. He has presented big data scripting techniques at the HPCC Systems national Data Summit. His background includes work at Microsoft, AT&T, and LexisNexis, and instruction at Intel, Boeing, Executrain, and the Department of the Navy.
