DATA NATURE BLOG
DASHBOARD, DASHBOARD -
How long will you last?
By Alex Barakov
January 2022
A day of futurism
A long solo vacation is a perfect time to think through the issues that got pushed aside all year. For the last 10 years I have been working on dashboarding. That work spans many other things, including data engineering, business analysis, design, consulting and even data governance, but ultimately my teams and I have been building dashboards for business. We built thousands of them (and as many again were deep-sixed, resurrected and eventually buried). Even if we haven't reached perfection, we have at least approached it. We've built processes, checklists and style guides, settled on solid templates; we know how to surface hidden customer needs, how to embed reports into business processes, how to train users; we've made plenty of mistakes and learned plenty of lessons from them. All in all, we can see right through a task and spot the signs of failure before it starts. Admittedly, we've read few books, and mostly pictorial ones, so surely one could do better.

And, as is often the case, you reach a point where the accumulated visual experience in some area comes to a head and produces first a sensation, then a hypothesis, and eventually a conviction. That is how my view of the dashboard evolved - the dominant BI paradigm of the last 10 years, which has become a kind of absolute for BI tools, the ultimate value everyone is supposed to chase and then enjoy owning in a pricey system. A BI system for a large company with, say, 2,500+ users costs serious money; in round numbers, call it $800K per year ($300K for software and $500K for the team) - some spend just under that sum, others much more. To recoup these costs you need to generate a lot of business benefit, in the literal sense: many concrete cases of accelerated decisions, improved solution quality and density, saved employee time, and so on (that's the subject of another post).
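To make the recoup math tangible, here is a back-of-the-envelope sketch using the round figures above; the hourly rate and the break-even framing are my own illustrative assumptions, not benchmarks:

```python
# Back-of-the-envelope cost math for the figures above.
# All numbers are illustrative assumptions, not benchmarks.
software = 300_000      # $/year, licenses
team = 500_000          # $/year, BI team
users = 2_500

total = software + team            # $800K per year
per_user = total / users           # $320 per user per year

# If we value an employee-hour at an assumed loaded rate of $50,
# each user must save roughly this many hours a year to break even:
hours_to_break_even = per_user / 50    # ~6.4 hours
print(f"${total:,} total, ${per_user:,.0f} per user, "
      f"break-even at ~{hours_to_break_even:.1f} h/user/year")
```

Six-odd hours per user per year sounds modest, which is exactly why the benefit has to be real and countable rather than simulated.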

And now you're leaving for vacation and thinking over what to do better next year, and how, so that you can believe in it yourself and get others on board. Time to turn on the f#cking futurist. So: it is no longer possible to believe in the dashboard, or in BI's higher purpose. It's fashionable to say that BI as we know it, dashboards included, is dying - that sounds fun, but it's clickbait vendor nonsense and it always ends in self-promotion. I'd rather puzzle it out and see the whole picture, gain new ideas or lower my expectations, than let the shit hit the fan. So, let's slow down. Here we need a f#cking plot.

*Do not try to find any connection between the picture and the text - there is none
Simulated happiness
In the old days, dashboarding came into use as a form of communication between a business user and data, and was considered advanced technology that effectively replaced spreadsheets (let's call them Excels for simplicity) - the breakthrough lay in beauty, interactivity, sharing and auto-updating. There was also a breakthrough in self-service BI adoption - building BI solutions inside the company, both as ready-made reports and as users' individual creations. But then growth stalled at around 25-35% of the target audience. There are several studies on this from the Eckerson Group, BI Survey, Gartner and ThoughtSpot - compare them with your own measurements; my data and gut feeling matched.
For instance, there was a good line in a Gartner report: self-service technology turned out to be useful only to the extent that users were able to serve themselves.


Ever since dashboards were taken seriously in corporate reality, alongside the real benefits there have been simulations and illusions of usefulness, as well as distortions of user workflows - entirely unconscious ones, to be fair. We want to be deceived ourselves, since our dashboards are so appealing. Here are some examples of such phenomena:
  • Leads astray... - users waste time figuring out what the long, beautiful report is telling them instead of what they are actually interested in;
  • Clients can promise a lot, so what?.. - users are always ready to give comments and suggestions, but that doesn't mean they plan to use the report in their work;
  • Borrowed solutions can't be trusted... - many users harbor a mistrust of anything not built by themselves, and this feeling turns out to be insurmountable. This is the woe of every report-factory service;
  • Ah, here I have to start thinking... - people tend not to use reports after development, even when those reports actually turn out to be useful. The sheer fact that the necessary report exists is calming, yet there is never time to log in, open it, filter and hunt for insights - everyone is busy killing current business tasks;
  • Any task = dashboard - users will prefer a dashboard for any task, even slides for a presentation or an ad-hoc query for hypothesis testing. Each time we build a board, we multiply visual variations that are not used for long, since users switch to the next task and request a new board, and so on;
  • The board itself works toward congestion, like a samurai strives for death - trying to give the user flexibility and variability, we build complex "anti-dashboards" by piling on a gazillion filters and switches, reducing understandability, insight and speed. Up to a point this can prop up the report's adoption, but ultimately it kills the dashboard inside it;
  • Self-service is not an unequivocal advantage - not everything users build counts as benefit. Most of it is samples and abandoned drafts that create the illusion of an impressive mass but have zero value;
  • Pricey hustle... - the traditional perverse cycle of an analytical task: a report is requested, then comes analysis of the available data, extra processing, board design, logic refinement - and by the time the manager's request is answered with a BI report, the request has already changed, been refined, supplemented or replaced by a new one. A stubborn analyst doesn't give up and tries to catch the last ride: clarifies requirements, reworks, bothers other experts, adds data. This keeps things moving, yet rarely ends in decision support, or the decisions made are small-scale compared to the input cost. Sometimes this hustle imperceptibly becomes a goal in itself on the way to an abstractly "desired" perfection of logic and visualization.
Here I'm lumping analytical and operational dashboards together (a fair point by Roma Bunin). True enough - let's sort that out a bit later.

Summarizing these disparate sketches, I would highlight three main reasons for the stagnation of BI adoption:

  • a fundamental information gap between the user and reports - users by and large don't understand them (don't want to dig in, can't, have no time - it doesn't matter); diverse company-wide reports get lost en masse, and users aren't ready to build their own for a number of reasons. Within the dashboarding paradigm, this problem is attacked through (1) better personalization and targeting of ready-made analytics delivery, which immediately drives up the BI team's operating costs for support, training and consulting, and (2) decentralized development and evangelizing of self-service BI. We experiment a lot with user experience here, creating customized, ready-made, role-based analytical workspaces with reports; all of this delivers a positive result, but it remains a huge and underestimated concern.

  • the slowness of dashboarding, its inability to keep up with changing business questions, both in the self-service format and in the report-factory format. Even a well-honed dashboard production process takes days and weeks (whatever they say about hours and minutes); that was a breakthrough 10 years ago, but it is a painfully long iteration for today's agile world. This became especially clear in 2020, when COVID-19 kept changing the situation and businesses had to respond to new challenges. While we were finishing one version of the dashboards about remote work, office returns and vaccinations, another one with different requirements and data was already needed. On the self-service side, we watched a caste of power users emerge in the business - essentially analysts with BI skills, simply distributed across the company in a decentralized way. But it's still a buffer between the decision maker and the data. This buffer remains, brings unavoidable efficiency losses, and globally doesn't solve the gap in the value chain. As a result, a pixel-perfect dashboard loses out to non-dashboard solutions - quick-and-dirty selects and charts built on current data.

  • lack of insightfulness - even a dashboard that is intuitive and delivered promptly does not generate insights by itself. That always requires resource-intensive extra effort from business users. You can raise the insight capacity of reports, but there is still the user's "last mile": look at a graph, a number, a color highlight, and pick up something valuable enough to drive an effective business action. This part is almost always opaque to BI report-factory teams, as two powerful cognitive biases collide here: (1) business customers overestimate the benefits and don't want to admit that they rarely use a perfectly correct report developed with their own involvement - that could read as their own incompetence; (2) a BI developer becomes emotionally attached to a report during development and ends up loving it more than it deserves, failing to notice its staleness and irrelevance and blaming users for low usage. I often catch us overestimating even the best of our reports.
*Again, no connection with the text)
Analysts are moving into the world of BI notebooks
Dashboards, as previously mentioned, should be divided into types. For simplicity, let's single out the three that seem most contrasting to me: strategic, operational and analytical:
1 - Strategic - a quick overview of state and triggers for decision making; focus on high-level indicators; ease of use; interactivity tends to zero; few or no filters; audience - top management;
2 - Operational - monitoring of metrics across various dimensions, comparisons, dynamics; more information, more complexity, deeper reading required; simple, proven user scenarios; a wider audience of middle and lower management;
3 - Analytical - require additional context; harder to read and mine for insights, with richer analytical facilities; non-obvious user scenarios; highly interactive, with lots of filters.

So, the first two types can now be merged - both are about in-process monitoring, just at different levels of aggregation; the design approaches differ, but the goal is the same: display information and speed up decision making as much as possible. The third type is a total mistake born of our BI pride - building dashboards for everything. We all seem to understand by now that open-ended data exploration is inherently a bad fit for dashboard-focused BI tools (the holy trinity of Tableau, Power BI and Qlik). We pretend otherwise and build report "Frankensteins" with a heap of filters and switches to maximize their coverage of unpredictable research scenarios. Vendors promote exploration features for casual users (like web edit) and wrangling tools (like Tableau Prep) for similar reasons. But the truth is: it's too little for an analyst and too much for a manager. And no, a few of your colleagues using it ad hoc do not disprove this.

Meanwhile, we can see the rapid spread of notebook tools that combine convenient code-based (sometimes drag-and-drop) querying in SQL / Python with ad-hoc visualization into charts and graphs. Such a "BI notebook" holds up-to-date data and context - everything you need to adapt and change an analysis, visualize, describe, share it and review someone else's at different stages. These conditional "BI notebooks" have scored real points, claiming a good share of the time analysts used to spend in BI tools. The idea is laid out in articles here and here, including, admittedly, an ad for the Count tool - probably a genuinely good one (an attempt to humanize old-school things like Jupyter and Zeppelin). Will the data-wrangling notebook kill Tableau and Qlik functionality? No idea, but there's more vim in it!
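To make the notebook workflow concrete, here is a minimal sketch of the loop these tools wrap in a friendlier UI - query live data in SQL, chart the answer in the same document, repeat. The connection, table and column names are hypothetical stand-ins:

```python
# Minimal "BI notebook" loop: SQL in, chart out, all in one document.
# Connection, table and column names are hypothetical.
import sqlite3
import pandas as pd
import matplotlib.pyplot as plt

conn = sqlite3.connect("warehouse.db")  # stand-in for a live DWH connection

# Step 1: ask a question in SQL against current data.
df = pd.read_sql_query(
    """
    SELECT date(order_ts) AS day, SUM(amount) AS revenue
    FROM orders
    WHERE order_ts >= date('now', '-30 day')
    GROUP BY day
    ORDER BY day
    """,
    conn,
)

# Step 2: visualize the answer right next to the query.
df.plot(x="day", y="revenue", kind="line", title="Revenue, last 30 days")
plt.tight_layout()
plt.show()

# Step 3: tweak the query, re-run, share the notebook - that's the whole loop.
```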

So, BI notebooks help analysts do their work and examine data by extracting fast answers to questions, but this is not an interface for a decision maker. What, then, will ultimately become the dominant analytical workflow for decision makers?
In essence, the new BI aims to eliminate the temporal and conceptual block between the decision maker and the insight-decision-action chain by:
- seeking a new form of self-service that removes not only coding but also data search, drag-and-drop development, slicing and part of the data analysis itself, possibly even shrinking the self-serve component to zero as a source of unnecessary losses,
- combining the advantages of agile, variable BI notebooks and a pre-configured, personalized dashboard.
*Do not try to find any connection between the picture and the text - there is none
New BI forms - what will supersede dashboards?
No use wondering - BI vendors have already figured it all out for us. Undoubtedly, all the fancy ideas picked up at the latest conferences: augmented analytics, NLP-based search, insight generation. Many of you have already seen Gartner's over-optimistic forecast that by 2025, 75% of data stories will be automatically generated by augmented analytics technologies.

The only thing is that the implementation of these features is still vague for both developer and user. It feels like the vendors themselves don't understand the new workflow yet. BI teams see no use or value in raw AI services, businesses don't care, and the problem of poor data quality multiplies every pilot attempt by zero. There are lots of questions. But let's start with the most palatable conclusion: dashboards are not dead - they never even lived the full life you and I wanted for them. The required level of personalization, variability and speed of analytics will push pre-configured dashboards aside as unnecessary, pricey and sometimes harmful middleware that pre-determines user experience in a non-optimal way. Dashboards will not die; rather, they will settle into a small niche of standardized, time-tested managerial reporting with stable, unalterable form and logic. The resources and focus of analytical teams will shift to new BI forms - conditional constructor-applications assembled by the user and auto-generated for the user from different entities. What will that look like? Here are some possible distinguishing features of the new BI:

  • A new interface with consumer-grade UX, so that a non-technical business user can navigate to insights (pinboarding is covered below);
  • Google-like search with a smart, self-learning engine using live query and no pre-aggregation - it will generate its own charts and surface other people's in response to user requests in natural language;
  • Text and visual AI-generated insights - narratives, highlighted areas of focus, potential insights in text form or expressed through special charts and colors;
  • Catalogs of one's own and others' search artifacts, with further processing of findings through task-tracking systems;
  • Proper "smart" alerts - the current out-of-the-box options never really took off due to their primitiveness, manual operation and usage restrictions (see the sketch after this list);
  • Dropping down into BI notebooks for real data exploration and review of logic and lineage, for those with enough data literacy;
  • New communication media around BI findings, with tighter integration into corporate messengers - similar functions go unused in current BI solutions mainly because there is nothing to discuss: insights are buried deep in charts, and communication inside the BI tool is non-native to the user;
  • Live connections between the BI system and the storage databases (cloud ones, as a rule) - the data freshness lag will be cut to zero or near it by streaming DWH solutions, and the effectiveness of the NLP function will grow thanks to "friendship" with the data models of key vendors (AWS, BigQuery, Snowflake, etc.) and new connectors;
  • Conversational BI - corporate chatbots and voice assistants - the next step for NLG in removing self-service user actions; even less ready today, but logical.
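On the "smart" alerts point above: a minimal sketch of what smarter-than-a-threshold could mean - flag a metric when it deviates from its own recent history instead of crossing a hand-tuned limit. The metric values and the messenger hand-off are hypothetical:

```python
# A "smart" alert sketch: flag a metric when it deviates from its own
# recent history, instead of relying on a hand-tuned fixed threshold.
# Metric source and messenger call are hypothetical stand-ins.
from statistics import mean, stdev

def check_metric(history: list[float], today: float, sigmas: float = 3.0):
    """Return an alert message if today's value is an outlier vs. history."""
    if len(history) < 7:        # not enough history to judge
        return None
    mu, sd = mean(history), stdev(history)
    if sd == 0:                 # flat history - nothing to compare against
        return None
    z = (today - mu) / sd
    if abs(z) >= sigmas:
        return f"Metric moved {z:+.1f} sigma vs. its norm ({today:.0f} vs. ~{mu:.0f})"
    return None

# Usage: 30 days of a daily metric, then today's value.
last_30_days = [102, 98, 105, 99, 101, 97, 103, 100, 104, 96,
                99, 102, 98, 101, 103, 100, 97, 105, 99, 102,
                98, 104, 100, 101, 96, 103, 99, 102, 100, 98]
alert = check_metric(last_30_days, today=62.0)
if alert:
    print(alert)  # in a real setup this would go to a corporate messenger
```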
This concept is best embodied by ThoughtSpot, which is not weighed down by an expensive legacy product and could build something new right away, learning from other people's dead ends. These guys lash out starkly at traditional BI (meaning Tableau, Qlik, Power BI, etc. - how fast the world moves), but their demo impresses with well-thought-out user experience. I liked the term pinboard (their marketing is great - starting with the naming)): it is still a board, but one the user forms by pinning elements (charts, insights, alerts, etc.) generated by the search engine, taken from others' work or built manually. The user's movement through drill-down / drill-up scenarios also looks native to human reasoning. A similar concept of contextual analytics is being developed by Yellowfin BI.

In general, an approach with minimal presetting, flexible assembly of panels from elements and a universal Google-like search interface looks simpler than current self-service scenarios, and more promising for drawing a new "sleeping" audience into the business-management world - and for a new breakthrough in self-service BI adoption from 30% to 60-70%. The laborious slicing and preliminary placement of elements will have to go: it is too time-consuming, and everything in it is by now so well understood and templated that manual labor can be eliminated. Users can in fact assemble pinboards for their own needs, on the required subjects and at a pre-defined level of complexity, from ready-made elements.
If even that is too much for you, there is still the scenario of using ready-made role-based pinboards (read: "dashboards").

Everything we are talking about can be illustrated by ThoughtSpot:
(take this not as advertising - I can't say the tool is perfect, I simply don't have the full picture of it)

*and no connection here either)
Things look worse for the inertial leaders - Qlik, Tableau and Power BI also invest in these ideas, but their results, in my opinion, lack a clear, unified interface connecting the NLP service and insight generation with the base product. Dashboards and the AI service live separately. Working through middleware in the data tier (as Tableau and Qlik do) is likely to seriously hinder the technology's take-off - tying the whole thing to a published data source looks like a severe limitation of the same Tableau. Power BI's insight generation currently looks like a wall of text describing deviations, most of them low-priority, and the textual form throws away all the advantages of visual analysis, turning the user into a reader (a sound point by Dima Gudkov).

While the vendors give themselves another 5-7 years to polish and train their semantically "dumb" NLG engines, let's imagine how the BI service itself - its processes and teams - would transform while adopting this approach.
Obviously, the rise of AI functions will increase business users' autonomy and reduce the flow of ad-hoc requests and report-development requests to data / BI analysts. This will free up the BI team's time and labor - where should the focus shift? My feeling is that the BI team will concentrate on the product and on all kinds of support for the AI engine's learning: data recognition, search efficiency and user-interface customization.

Here are examples of activities that, in my opinion, will take priority:
  • Taking metadata work to a super-level (a minimal sketch follows this list). This means shifting resources from data-mart and report production to adapting data sources and reports for NLP - adding tags, synonyms and formats will be vital to the quality of auto-interpretation;
  • Adding more and more data to the set that is available to, and understood by, the AI engine; setting up database integrations; maintaining the connectors and keeping them working correctly;
  • Teaching users to work with the new interfaces and search queries, and building their confidence in the service;
  • Configuring and maintaining alert and insight models - there is no clear target administration model for these features yet, but the story seems to require manual curation;
  • Assembling and supporting ready-made key "pinboards" for direct use or customization;
  • Certifying individual charts, pinboards and datasets for top-priority display, and removing garbage;
  • Developing separate "complex" reporting models with special logic that a machine cannot generate - cost allocation, headcount forecasts, specific candidate funnels, and so on;
  • Independent data analysis and insight hunting - building up business-consulting experience;
  • Work on data quality - mustering strength for a new battle.
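As promised after the first bullet, a toy sketch of what "adapting data sources for NLP" could amount to: a semantic layer that maps business synonyms onto physical columns so a search engine can resolve natural-language questions. The schema and the matching logic are illustrative, not any vendor's actual format:

```python
# A toy semantic layer: map business vocabulary onto physical columns so a
# search engine can resolve questions like "show staff by location".
# The schema and matching logic are illustrative, not a vendor format.

SEMANTIC_MODEL = {
    "headcount": {
        "column": "hr.employees.employee_id",
        "aggregation": "COUNT DISTINCT",
        "synonyms": ["staff", "employees", "number of people", "fte"],
    },
    "office": {
        "column": "hr.employees.office_city",
        "aggregation": None,
        "synonyms": ["location", "city", "site"],
    },
}

def resolve_terms(question: str) -> dict:
    """Naive term resolution: match known names and synonyms in the question."""
    q = question.lower()
    hits = {}
    for name, spec in SEMANTIC_MODEL.items():
        if name in q or any(s in q for s in spec["synonyms"]):
            hits[name] = spec["column"]
    return hits

print(resolve_terms("show staff by location"))
# {'headcount': 'hr.employees.employee_id', 'office': 'hr.employees.office_city'}
```

Real engines do far more (ranking, learning from clicks, join inference), but the curation burden - tags, synonyms, formats - lands on the BI team either way.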

Should you change your BI tool? It seems not, at least not for reasons of BI evolution. Everyone now has similar visions and projects; there are differences in execution, but it is hard to say who will be first to assemble a workable concept. The popular ratings of systems will probably shift with the new reality, and customers will move to new, daring players. So, if you are choosing a BI system and are not yet bound by vendor prejudices, take a close look at ThoughtSpot among the "grown-up" systems (not sponsored, not investment advice)), or at Arria (an NLG extension for Power BI, Tableau, Qlik and others), or at recent niche solutions like Narrative BI, Outlier, Lexio and DataStories (spoiler: they'll all be bought soon). As for me, I'll be waiting for new revelations from the thaumaturgic powers of Tableau.
What can be done in advance? Work, as is commonly known, fills all the time allotted to it, and if a resource isn't carved out, there won't be any progress. Options:
- Make a roadmap for drastically solving the company's critical data-quality issues (employees, clients, candidates, costs, payments, shipments, reserves, etc.). If there are gaps in this data, you won't leapfrog to the new BI. It is not too late to tackle the problem globally - deploy a good DQM solution (a minimal sketch of such checks follows this list);
- Accumulate a base of insight cases; whatever the new BI system's capabilities, you and your team will have to configure adequate insights for it one way or another (thus, you need to maintain such a base);
- Invest in modernizing the data stack - data warehouses, data lakes, lakehouses and other data meshes)) - outdated solutions will block the BI upgrade;
- Start refining metadata and debugging the processes that maintain it;
- Shift from financing BI as a cost center to a new funding model with efficiency metrics tied to revenue and the monetization of reporting - the "number of dashboards and reports" metric will soon, at last, be laid to rest.
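On the DQM point in the first option: a minimal sketch of the kind of automated probe such a roadmap starts from - profiling a critical entity for duplicate keys and null rates before trusting it to any AI engine. Table and column names are hypothetical:

```python
# Minimal data-quality probes for a critical entity table - the kind of
# checks a DQM roadmap would automate. Names are hypothetical.
import pandas as pd

def profile_entity(df: pd.DataFrame, key: str, required: list) -> dict:
    """Return basic DQ indicators: duplicate keys and null rates."""
    report = {
        "rows": len(df),
        "duplicate_keys": int(df[key].duplicated().sum()),
    }
    for col in required:
        report[f"null_rate[{col}]"] = round(float(df[col].isna().mean()), 3)
    return report

clients = pd.DataFrame({
    "client_id": [1, 2, 2, 4],
    "email": ["a@x.com", None, "b@x.com", "c@x.com"],
    "country": ["NL", "NL", None, "DE"],
})
print(profile_entity(clients, key="client_id", required=["email", "country"]))
# {'rows': 4, 'duplicate_keys': 1, 'null_rate[email]': 0.25, 'null_rate[country]': 0.25}
```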

All these arguments may leave someone feeling a strong need for change - or, on the contrary, convinced of the triviality of the new trends. But this is futurology: useful for shaping transformation concepts, and not much else. A clear understanding of your company's current tasks and constraints, and of your current tool's capabilities, will always matter more than new features that are still a long way off. Meanwhile, BI should deliver benefits here and now. So, God save dashboards)