Data, requests, charts, requirements, more requirements, more data; finally a release, and immediately a new project and a new report. We pour effort into dashboards during development and user acceptance testing, switch to new tasks, and then sadly discover the dismal usage statistics.
I have also noticed among BI analysts (myself included) a tendency to defend our own vision of how a dashboard should be organized and to dismiss user feedback as incorrect when users fail to understand or appreciate its capabilities, design, and so on. Individual negative feedback is often brushed off as insignificant, so the report developer may never realize that the report is misunderstood or rejected by users at scale. This is a form of cognitive bias among report developers. It's easy to say, "Users always want their own Excel, what can I do?"
In what follows, I will focus on applying CustDev practices to dashboarding: specifically, gathering feedback throughout the lifecycle of analytical products, and more specifically, setting up a feedback flow that improves report adoption.
CustDev practices have proven effective in product development: any user-facing internet service or mobile application puts great effort into setting up a user feedback flow and analyzing the customer funnel from first contact to churn. For such products, improvements in this area directly affect financial results (through classic metrics such as ARPPU, LTV, etc.). Yet CustDev ideas are rarely applied in BI projects, even though treating a report as a product and the user as a paying customer would be the logical approach. There even seems to be a certain pride among BI teams in refusing to adopt a product mindset.