How To Apply Big Data Concepts to a Traditional Data Warehouse

by Jamie Jackson, on July 8, 2016


You don’t need to commit to Big Data to improve your data initiatives; you just need to adopt Big Data methodologies. Most companies don’t yet need to fully convert to Big Data frameworks; they are really just trying to improve their internal BI capabilities.

For example, traditional Data Warehouses are heavily dependent on ETL integration tools as the centralized store of business-rule transformation logic. The logic that takes raw source-system data and turns it into business KPIs has been locked away from the business and data analysts who understand and need it most. Until now.

By shifting a couple of consonants, companies have been able to put the transformation logic back in the hands of Data Analysts and BI Developers, instead of being dependent on ETL Developers. With the recent progress of Big Data, companies are far better positioned to break from the traditional ETL (Extract, Transform, Load) approach and move toward an ELT (Extract, Load, Transform) methodology.
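To make the ETL-to-ELT shift concrete, here is a minimal sketch using Python's built-in sqlite3 module as a stand-in warehouse. The table, column names, and figures are illustrative, not from any particular system. Instead of transforming rows in an external ETL tool before loading, the raw data lands first and the business rule lives in the warehouse as plain SQL:

```python
import sqlite3

# ELT: load the raw source data as-is, then transform it in SQL,
# where analysts can see and change the business rules directly.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE raw_orders (order_id INTEGER, amount_cents INTEGER, status TEXT)"
)
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 12500, "shipped"), (2, 9900, "cancelled"), (3, 4300, "shipped")],
)

# The "T" is a readable SQL view -- a business KPI any analyst can
# inspect and modify, instead of logic buried inside an ETL tool.
conn.execute("""
    CREATE VIEW kpi_net_revenue AS
    SELECT SUM(amount_cents) / 100.0 AS net_revenue_dollars
    FROM raw_orders
    WHERE status = 'shipped'
""")

print(conn.execute("SELECT net_revenue_dollars FROM kpi_net_revenue").fetchone()[0])
```

The same pattern applies to SQL Server, MySQL, or any other relational database: the raw landing tables stay untouched, and every derived KPI is a view or query that analysts own.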

Here are a few other ways to integrate Big Data philosophy into your traditional RDBMS stack:

  • Focus on EL: Leverage integration tools that enable you to EL (extract and load). Use their built-in connectors, which simplify the bulk retrieval of data via your systems’ APIs. Savvy data analysts are perfect candidates to pick up and develop a “straight move” integration mapping or job.

  • Consolidate to SQL: If your data is already in a relational format, simply land it in your SQL Server, MySQL, or similar database. SQL is the universal language for data people, so why not use that as a strength and eliminate the redundancy of rebuilding relationships that already exist in your source systems?

  • Utilize Raw Data: If you allow your analysts to transform the raw data as they see fit, it's much easier to establish standards and best practices. You then also engage your key players in governance and can maximize the performance of your transformations, instead of leaving them in each analyst's proprietary silo (SAS programs, Access databases, Excel spreadsheets, etc.).
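A “straight move” EL job from the first bullet can be sketched in a few lines. This is an illustrative example: the JSON payload below stands in for a bulk API response from a source system, and SQLite stands in for the target database. No transformation logic sits between extract and load; the records land exactly as retrieved.

```python
import json
import sqlite3

# Stand-in for a bulk API response from a source system (illustrative data).
api_response = json.loads("""
[
  {"id": 1, "email": "a@example.com", "plan": "pro"},
  {"id": 2, "email": "b@example.com", "plan": "free"}
]
""")

# "Straight move": land the records in a raw table with no business
# logic in between -- transformation happens later, in SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_customers (id INTEGER, email TEXT, plan TEXT)")
conn.executemany(
    "INSERT INTO raw_customers VALUES (:id, :email, :plan)",
    api_response,
)

print(conn.execute("SELECT COUNT(*) FROM raw_customers").fetchone()[0])  # 2
```

Because the job contains no business rules, a data analyst can own it end to end; the rules themselves stay in the database as shared, governable SQL rather than in each analyst's private spreadsheet or script.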

If your organization's data isn't rapidly changing in format, isn't measured in petabytes (one billion megabytes), and isn't highly unstructured (e.g., free text), then all of this will improve your data initiatives and better position your business to migrate to a Big Data platform in the future.

Need more guidance on how to optimize your current data systems? Click here to contact us and we’ll help you get the most out of your data! 


The Arkatechture Blog
