Now that we’ve discussed the full lifecycle for implementing data analytics capabilities and deriving insight, how does one sustain it going forward? In the past we participated in data warehousing or data analytics projects that were seemingly successful and received positive reviews when completed. However, when we touched base with the team a year later
The Revolution In his 2007 classic, Competing on Analytics, Tom Davenport predicted the rise of “analytical amateurs,” where the enterprise extends its frontline data analytics capabilities across the organization in its effort to become a data-driven enterprise. That transformation would be done largely without assistance from traditional information technology department capabilities and processes.
Last week, we introduced the assessment criteria for selecting the right tools for your big data platform. This week in the Big Data Diamonds series, we will focus on using an Agile implementation methodology to continuously drive value and insight. Maven Wave employs the Agile methodology for delivering solutions to our clients and the incremental and iterative
Now that you have identified the components required to define your big data architecture (as discussed in last week's Big Data Diamonds blog post), we will introduce the assessment criteria for selecting the right big data tools. Selecting the right tools for your big data platform is not as easy as selecting the best application on
In last week's Big Data Diamonds blog post, we discussed the importance of assessing your data to ensure that you understand it. This week we are introducing the components required to define the big data architecture. Once you understand the data and business objectives, you are equipped with the inputs necessary to define a robust architecture.
Most great accomplishments are not delivered by great ideas alone. It takes a team with the right skills to deliver a quality result. The same holds true when delivering a big data project. In our last Data Diamonds blog post, we talked about starting any big data project with a well-defined purpose and the importance of
Data as a Service is a strategy that has been around for some time, but only now have we observed it truly hitting its stride. It is used to access business-critical data in real time, in a secure, affordable, cloud-based manner. For decades, businesses have sought to become more data-driven, making decisions more of a science than an art.
Executives recognize that the ability to effectively retain and recall knowledge is a mission-critical capability that must be approached strategically. Knowledge assets clearly create corporate value -- perhaps most measurably in the healthcare industry through drug approvals and patent protection. A granted patent or approval converts knowledge or data into a clearly defined commercial opportunity
Unofficially, a 2011 McKinsey white paper marked the beginning of the popularization of the term big data [1]. Since then, there have been enough large data set projects executed that we are starting to learn why these types of projects sometimes fail. In our opinion, the challenges associated with large data projects fall into five categories: