Join top executives in San Francisco on July 11-12 to hear how leaders are integrating and optimizing AI investments for success.
Big Data keeps getting bigger. In 2021, the Big Data Landscape grew to 2,025 technological tools, all promising to help businesses solve their data needs. Even the “pared down” version offers seven categories and 96 sub-categories, with tools ranging from the old guard to the cutting edge. To navigate this landscape, businesses need a framework to ensure they understand their needs and leverage tools to help them get the most out of their data.
Enter technology rationalization, a framework for assessing your data needs, prioritizing and road mapping those needs, and then finding the best tool for the job. It isn’t just about adding new tools and technologies. It’s primarily about assessing what your data needs and whether what you currently have is the best fit. Technology rationalization helps businesses assess needs, leverage tools effectively and cut bloat from their data stack.
Celebrity chef Alton Brown can help guide you. If you’re an Alton fan like me, you know he has a rule against “unitaskers,” kitchen devices created for a single task.
Alton’s “no single-use tools” rule can be applied to company data stacks as well. Much like I am reluctant to invest in a cherry pitter when I already have a paring knife, I always think twice about recommending single-use data tools in a data stack. If you don’t understand your business needs and how data can support them, you risk buying that cherry pitter only to have it gathering dust and taking up valuable kitchen real estate.
No matter how excellent a team’s data flow diagrams are, data stacks are filled with nuance that makes each company unique; data, and how it’s leveraged, is part of a company’s competitive advantage. This complexity, however, can become a disadvantage when it leads to redundancy.
Reducing the complexity of your stack leads to a streamlined process and greater transparency for tracing failures, lags and secondary needs. With fewer “points of failure” to monitor, your data engineering team can focus on adding value — not remediating problems.
Need another reason? A less complex data stack can help expedite onboarding of new team members and general knowledge transfer. Lowering the hurdle to learning new tools means more of your team can become full-stack data engineers with the ability to help end-to-end. A more streamlined data stack also keeps everyone happy by helping reduce instance costs. While a more efficient data stack does not always correlate with lower instance costs, reducing bulk typically has cost implications. Paying for fewer tools, and spending less time onboarding and troubleshooting, translates into lower costs.
But technology rationalization’s less-is-more approach doesn’t always mean forgoing more specialized tools. Going back to Alton’s kitchen: if I start making a cherry pie every weekend, that single-use cherry pitter becomes very necessary. Reducing clutter and complexity in your data stack can mean fewer tools, but it primarily means finding and using the right tools. Each platform should have a clear value-add that, even with its addition, reduces the overall complexity of your stack. That means before you rationalize your technology, you must first assess your needs.
Start from square one: What business needs do you want your data to fill? And what business questions are your stakeholders asking? Can your data repeatedly answer these questions in a timely manner?
There are thousands of single-use kitchen gadgets out there, and the number of data tools is not far behind. Your team must be able to identify where gaps exist and understand which needs should be prioritized to quash the clutter.
Starting the process can feel daunting. The first step is a conversation about where you are and where you want to be. Just as Alton Brown’s rule can help consolidate kitchen utensils, technology rationalization can streamline your data stack.
Laura McKinley is a principal consultant and solution architect at DAS42 with extensive experience in business intelligence, data warehousing, data engineering and analytics solutions architecture.