Something happened on the way to Analytics

I just spent the week onsite with a client’s business intelligence team. Apart from the amazing cultural experiences after each day’s work, working alongside a corporate technology team let me see things from their perspective for the first time. The client was launching their new ERP system, and was starting to build an analytics system on Azure that would serve business users with information from this new ERP and other systems.

Working for a BI technology solutions partner, in the long run, morphs the way you look at things: technology first, trying to fit client needs to the types of solutions we do best; assuming that all tech people will be comfortable with the next best thing in technology; assuming that business users will always know the first thing about BI; believing that your architecture is always the right one.

Technology first

Despite knowing, understanding, and architecting a solution for a client’s business needs, the zeal to implement it using the latest and greatest technologies is overwhelming, and is even “believed” to be the need of the hour. Very often the solution is already pre-designed in my head around new technologies.

Let’s say that an organization has indeed gone ahead and invested in a new platform such as Microsoft Azure as part of its initiative to put everything on the cloud. That should not be an automatic signal that they are ready to adopt everything Azure has to offer just because it is part of the package. They would rather work with technologies closely related to what they are familiar with. So when I jumped in all guns blazing, yelling “Data Lake, Data Lake and Power BI, Power BI”, I should rightly have expected them to put up their hands and say “hold up!” It was a good thing I didn’t even whisper “Databricks”.

Being a consultant and evangelizing technologies is good. But when consulting with a team that has been working on traditional technologies for quite some time, and has had success doing it, what matters is talking them through the paradigm shift in the way we do analytics and guiding them to the new set of technologies. They are not a test bed for Microsoft technologies, and can easily feel like one if I seem to be forcing different technologies down their throats…

Names are misleading

When Azure SQL Data Warehouse was chosen to implement a multi-dimensional data warehouse, it may have seemed like the ideal choice. Why? Because the keywords were plain to see: “SQL”, “Warehouse”. However, SQL Data Warehouse is ideal only when your data volumes are quite high, not when they amount to a few hundred gigabytes. Armed with a few more reasons why not (a good reference for choosing Azure SQL Data Warehouse), I confronted them. The rebuttal at the time was that performance was good enough and that cost wasn’t a problem. Until, of course, a few months later, when complex queries started hitting the system, and even though they could afford the cost, the value for the amount being paid no longer seemed worth it.
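The underlying decision can be sketched as a simple rule of thumb. The threshold and criteria below are my own illustrative assumptions for this sketch, not official Microsoft sizing guidance:

```python
def suggest_azure_sql_service(data_size_gb, needs_mpp_scale_out):
    """Illustrative rule of thumb only: the ~1 TB cut-off and the MPP
    criterion are assumptions for this sketch, not official guidance."""
    # SQL Data Warehouse is a massively parallel (MPP) engine tuned for
    # large scans; a warehouse of a few hundred GB rarely benefits from
    # the overhead of distributing data and queries across nodes.
    if data_size_gb >= 1024 or needs_mpp_scale_out:
        return "Azure SQL Data Warehouse"
    # For several hundred GB with mixed, concurrent query workloads,
    # a symmetric engine such as Azure SQL Database is usually the
    # better fit on both performance-per-dollar and simplicity.
    return "Azure SQL Database"

# The client's scenario: a several-hundred-GB warehouse, no MPP need.
print(suggest_azure_sql_service(300, False))
```

The point is not the specific numbers, which will vary by workload, but that the product name alone (“SQL”, “Warehouse”) should never be the deciding input.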

This is not where, as a consultant, I go in saying “I told you so”. Rather, this is where I acknowledge the error, suggest alternatives, work through the alternatives they come up with, and plan for change (such as moving to Azure SQL Database).

BI what?

After all of this, it finally came down to which business intelligence tool the business users should be using. I “knew” the obvious choice “was” Power BI. However, the team was not convinced when I told them that it was easy to build reports and dashboards with Power BI. Sure, but so was building reports and dashboards with the new Cognos, since they all knew Cognos so well. They even brought up the tool that came with their ERP, despite it not supporting grouping in reports, because its consultant had demonstrated it as “easy to build” with. The lesson of the day: to each person familiar with a particular tool, building reports with that tool will seem easy. Hence, it was decided to take a fact-based approach, not an opinion-based one.

Then, just by chance during the fact-based debate, came the interesting bit, when I asked the question: “What type of reports will we be running for the users?” The answers all pointed towards operational reports and ad-hoc reports. The ad-hoc reports looked analytical in nature, but were delivered as flat Excel files or tabular reports. The operational reports were all pages of tabular reports. Digging further revealed that with the new ERP launch, the ask was for more and more operational reports. And whose job was it to churn out these reports? The BI team’s, of course! A cringe-worthy moment.

Months into the engagement, for the first time, after they had set off down a path of business intelligence, and on Azure with its many options for analytics at that, they were writing heavy queries for operational reports, because the ERP was only so good at providing the reports that were needed, and “naturally”, as many business users expect, the job of supplying that demand had fallen on the BI folk. Heavy, complex queries were hitting the star schema on the SQL Data Warehouse. The results were then dumped into flat Excel files, which the Excel-comfortable users consumed, sometimes, interestingly… wait for it… <insert theme to Jaws> performing their own analysis on top of them, because not everything they wanted was given the way they wanted it.

Back to the drawing board

With all the BI initiatives, it was evident that what the users currently needed were operational reports. They also wanted ad-hoc reports presented in tabular form. Clearly, the architecture that had been adopted was not the right one.

Current Architecture

It was soon decided that the team fall back, perform a reality check, and serve the users’ primary need: operational reports. The architecture would be altered, with minimal effort, to serve this need, using Cognos, since it produced printable reports and the BI team was already familiar with it. Everyone understood and agreed that BI was not the need of the hour. However, the need for BI would come soon. Hence, it was decided that most of the ad hoc reports would be built in an analytical fashion on Power BI, to tease the users with what was possible beyond the Excel-based method everyone was used to. Excel-based analysis would still be provided, but through Power BI datasets.

First Iteration of the New Way Forward

Crude, you may say. But this was most agreeable. Users will not be overwhelmed; the BI team will not be overworked; technology will not be given priority. The need of the hour will be served, and the need to come will be slowly infused into the users’ DNA. And since this is Azure, it allows for an initial architecture with less spend and fewer resources, letting the BI folk iteratively build out the analytics platform and improve the architecture along the road of their BI journey.

The Analytics Pane in Power BI

The August 2016 edition of Power BI Desktop introduces a new pane named “Analytics”, right next to the “Fields” and “Format” panes. Now, this title probably gave you, just as it did me, a racing heart and goosebumps. However, when you actually go to the pane, you just blurt out a disappointed “oh…”. Well, not that it is bad: you have options such as adding a percentile line, a median line, a trend line, and a few more. But one would expect to see a little more with respect to analytics.

One thing that you do find is a forecast line, which for now works only on single-measure line charts. It’s pretty neat, but as advanced analytics go, it requires enough data points to give you a good forecast. Of course, this is just the start; we’re sure to see more analytics capabilities in the future.

The BI and Analytics Magic Quadrant in 2016 – Power BI rules!

The Magic Quadrant for Business Intelligence and Analytics Platforms was released a few days ago by Gartner. This report, released annually, analyses the vendors of business intelligence and analytics and places them on a quadrant to indicate their capabilities. This year, things are different. Many of the key players from previous years have fallen out of the Leaders quadrant, with only three remaining. Oracle is not seen anywhere in the four quadrants (they did not qualify for inclusion based on the criteria).

Gartner2016
Source: Gartner (February 2016)

The criteria for this latest assessment are based on 5 use cases and 14 critical capabilities of a BI and analytics platform, mostly focused on agility and self-service. Gartner explains that the trend of BI and analytics shifting from an enterprise reporting model to a self-service model has now reached a tipping point, and for the first time Microsoft is seen as a visionary leader in this space. That Gartner based Microsoft’s assessment solely on Power BI goes to show the potential of the product. The second iteration of Power BI, with its desktop module and the online portal, offers an intuitive and simple-to-use interface for users to build data discovery and visualization solutions. With support for a plethora of cloud-based and on-premises data sources, along with upcoming features such as Cortana integration, I think Microsoft is on a good path towards what Gartner predicts the BI and analytics landscape will look like by 2018. Polish up a few cautions indicated by Gartner, such as Power BI’s limited advanced analytics capabilities, and they’d look even better.

Microsoft had also been in the Leaders quadrant for the last 9 years, during the time when enterprise BI was at its peak. Couple that with the latest assessment and you could safely say that the collective Microsoft BI stack is a force to be reckoned with.

For reference:

Gartner2015 Gartner2014