“There’s an Abundance of Data but a shortage of thinking about Data”.

November 2, 2011

This is a line from Monday’s presentation by Steve Levitt, co-author of Freakonomics, at Teradata Partners 11. His presentation has been a real hit if the Twitter traffic is anything to go by. The flow of tweets on #TDPUG11 dried up for a while and then there was a deluge. This is the new metric of a quality presentation: one so good that it ‘stops the traffic’ on Twitter!

Steve’s presentation is worth a whole article in its own right, but let’s focus on that one line: ‘there’s an abundance of Data but a shortage of thinking about Data’. I’d been reading recently about Process BI, the marrying of Business Process Management techniques and BI. It’s an idea that has been around for a while, but what struck me was that IT gladly helps the business develop these kinds of applications to improve operational efficiency by identifying unnecessary costs or finding ways to increase customer satisfaction.

However, in IT we rarely apply the same techniques to our own applications and processes. Yet, by doing so, we can achieve a host of direct and indirect benefits: streamlining the flow of information to analysts and decision makers, and readying the ground for low-latency, information-rich BI.

OK, I hear you say, but we can barely keep up here, never mind stand back and survey what we’re doing. But before you dismiss the idea, just how far away are we from being able to realise this? We have the data; an abundance of it, as Steve put it, in the Informatica Repository and elsewhere in the infrastructure. So how can we think about that data? Fortunately, in the Informatica world we have the tools to interpret it.
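To make that a little more concrete, here is a minimal sketch of what ‘thinking about the data we already hold’ might look like. It assumes session-run history has been exported from the repository to a CSV file; the file name and the column names (WORKFLOW_NAME, SESSION_NAME, ACTUAL_START, SESSION_END, SUCCESSFUL_ROWS) are illustrative assumptions, not an exact match for any particular repository view, so adjust them to whatever your environment exposes.

# A minimal sketch, not a definitive implementation: it assumes session-run
# history has been exported to session_runs.csv with hypothetical columns
# WORKFLOW_NAME, SESSION_NAME, ACTUAL_START, SESSION_END and SUCCESSFUL_ROWS.
import pandas as pd

runs = pd.read_csv("session_runs.csv", parse_dates=["ACTUAL_START", "SESSION_END"])

# Duration and throughput of each session run.
runs["DURATION_MIN"] = (runs["SESSION_END"] - runs["ACTUAL_START"]).dt.total_seconds() / 60
runs["ROWS_PER_MIN"] = runs["SUCCESSFUL_ROWS"] / runs["DURATION_MIN"].clip(lower=1)

# Compare the last 30 days against the 30 days before that to spot
# sessions whose run times are creeping up.
latest = runs["ACTUAL_START"].max()
recent = runs[runs["ACTUAL_START"] > latest - pd.Timedelta(days=30)]
previous = runs[(runs["ACTUAL_START"] <= latest - pd.Timedelta(days=30)) &
                (runs["ACTUAL_START"] > latest - pd.Timedelta(days=60))]

summary = pd.DataFrame({
    "recent_avg_min": recent.groupby("SESSION_NAME")["DURATION_MIN"].mean(),
    "previous_avg_min": previous.groupby("SESSION_NAME")["DURATION_MIN"].mean(),
}).dropna()
summary["pct_change"] = 100 * (summary["recent_avg_min"] / summary["previous_avg_min"] - 1)

# The sessions slowing down fastest are the first candidates for attention.
print(summary.sort_values("pct_change", ascending=False).head(10))

A few lines of analysis like this turn raw run logs into a ranked list of where the process is degrading, which is exactly the kind of ‘thinking about the data’ Steve was talking about.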

Consider this: Fred Hargrove, who writes about these matters for Information Management, identified five key things needed to achieve and maintain successful execution of process-oriented BI:

  • End-to-end business process knowledge
  • Continuous improvement mindset
  • BI capabilities of operations
  • Data governance discipline
  • Data latency reduction

What does this mean for those of us at the Data Integration coalface? Well, put simply, we must:

  • Understand the end-to-end process from source system(s) to the BI reports, data feeds and the other outputs of our Data Integration
  • Be committed to making things work better
  • Give operational owners the user-friendly reporting and analysis tools they need to assess problems and issues without getting lost in the data
  • Put an effective data governance program in place, since data whose quality cannot be assured cannot be made actionable
  • Right-size how quickly the information is produced so that it is still relevant to the business situation by the time it is used (see the sketch after this list).
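On that last point, ‘right-sizing’ latency simply means comparing what each feed actually delivers with what the business actually needs. The sketch below illustrates the comparison; the feed names, measured latencies and target latencies are invented for the example and would come from your own monitoring and service agreements.

# A small sketch of right-sizing latency, using assumed numbers: compare the
# measured end-to-end latency of each feed (source extract to report
# availability) against the latency the business actually needs.
from datetime import timedelta

feeds = {
    "customer_orders": {"measured": timedelta(hours=2),  "needed": timedelta(hours=24)},
    "web_clickstream": {"measured": timedelta(hours=12), "needed": timedelta(hours=4)},
    "finance_ledger":  {"measured": timedelta(hours=6),  "needed": timedelta(hours=6)},
}

for name, f in feeds.items():
    if f["measured"] > f["needed"]:
        verdict = "too slow: reduce latency or renegotiate the need"
    elif f["measured"] < f["needed"] / 2:
        verdict = "faster than required: effort may be better spent elsewhere"
    else:
        verdict = "about right"
    print(f"{name}: measured {f['measured']}, needed {f['needed']} -> {verdict}")

The point is not the code, but the discipline: over-delivering on latency costs money just as surely as under-delivering costs credibility.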

Now, if you’re still wondering what all of this has to do with you, your organisation or even Informatica, or if you suspect that I might just be day-dreaming of some ideal world, let me map these prerequisites to the toolsets of Informatica and my own company, Assertive:

  • Metadata Manager gives us the end-to-end process view
  • The Lean Integration, or Factory model, gives us the methodology and justification to implement a continuous improvement process
  • HIPPO from Assertive provides the Operational BI reporting capabilities
  • Informatica’s Data Governance tools are second to none
  • Informatica’s tools provide whatever Data Latency level you can justify

“Well, OK,” I hear you say, “we accept that we have the data in the Repository and elsewhere, and that we can have the toolset, but how do we justify spending time and effort analysing how to optimise performance when we are struggling with ever-increasing workloads as it is?”

In Data Integration, standing still is not an option. As I’m sure you are only too aware, data volumes are growing for all kinds of reasons, not least to support current BI trends like Predictive Analytics and the analysis of social network traffic. Yet IT budgets are under challenge like never before, and increasingly the business, the pre-eminent commissioner of IT initiatives, sees its internal IT department as just one of several competing options for delivery. To even stay in the race we must keep getting better at what we do!

The challenge, then, is to improve our Data Integration productivity in terms of volume, latency and quality. To do so, we need to make sure that daily routine and conventional wisdom don’t blind us to what’s in the data: we need Process BI to inform some clear thinking about our own processes. Just as importantly, we must put this Process BI into the hands of the operational process owners so they can action the changes needed. Otherwise, significant operational improvement simply cannot be achieved within constrained budgets.

As Steve Levitt said, there’s an abundance of data but a shortage of thinking about data. We can change that in our own Data Integration corner. We have the tools from Assertive and from Informatica. We have an abundance of data, and we have never had a more pressing imperative to take action. I think it’s time to get those thinking caps on!
