Our design model

So my head is spinning more than a little after this week. I’ve had a data strategy meeting (with whom and for what purpose I genuinely do not know), then three or four separate data model development meetings, and then reviewed my team’s work. All of this has made me feel it is time to share my design philosophy and how I go about building Power BI reports.

I use a four-step process which, in my personal opinion, fits the product best. The key thing to remember is that you do not always have to go forward: I’m a huge believer in emergence and emergent behaviours, and BI is a prime environment for emergence.

Emergent behavior is behavior of a system that does not depend on its individual parts, but on their relationships to one another. Thus emergent behavior cannot be predicted by examination of a system’s individual parts.

In other words, as reports develop and value is shown, additional relationships or correlations become clear. This understanding is what drives my design principles; hopefully, even if you disagree, you will appreciate some of the ideas and can incorporate them into your own modelling principles.

By dividing the report creation process into four distinct stages, expectations and delivery gateways are established. Typically you should try to sign off each stage before moving forward, but you have to accept the reality that emergence will often move you backwards. That is fine and must be accepted; pushing on regardless is the recipe for disaster.

Going with the flow & Trending Power BI Metrics

This is a quite common issue as you start to move forward in Power BI, so I’ll try to scope it out for you.

A client has five or six data sources coming together to produce a Power BI report. Due to the nature of the data sources, they return a “now” state, so trending is a challenge; typically, to set trending up, we would set up a snapshot. In this data model, data is coming in from a range of services, not all of which are in SQL, and some data is coming directly from products as well. The net result is that snapshotting would be complex and would ultimately remove the benefit of using Power BI and the Mashup engine. Then Patrick spelled it all out in a Guy in a Cube video. Now, we didn’t need to do all of this, but something went “ping” and suddenly I had a vision of the future: we could use the SharePoint site integrated with the App workspace to host a SharePoint list, and then use the SharePoint list to bring the data back.

That’s dumb! Why would I want to do it this way? SharePoint isn’t going to hold the data I need!

The biggest benefit of using this method is that trends can be set up quickly and easily, based on complex “unrelated” data.
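Since the post stops short of the detail, here is a minimal sketch of the idea. This is hypothetical Python rather than the actual Power Query, just to show the logic: each refresh appends a timestamped copy of the “now” state to a persistent store (the SharePoint list in the post), and the trend is simply that history read back in date order.

```python
from datetime import date

# Stands in for the SharePoint list; names here are illustrative, not real.
snapshot_store = []

def take_snapshot(now_state: dict, snapshot_date: date) -> None:
    """Append a timestamped copy of the current 'now' state."""
    snapshot_store.append({"date": snapshot_date, **now_state})

def trend(metric: str) -> list:
    """Read one metric back in date order, ready to plot as a trend."""
    return [(row["date"], row[metric])
            for row in sorted(snapshot_store, key=lambda r: r["date"])]

# Two refreshes on consecutive days turn point-in-time data into a history.
take_snapshot({"open_tickets": 42}, date(2017, 5, 1))
take_snapshot({"open_tickets": 37}, date(2017, 5, 2))
```

The point is that the report sources never need to hold history themselves; the list accumulates it on their behalf.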

Watch this space: I’ll keep adding to this post as we find things out about using the mechanism. After all, there isn’t really a shortcut to setting up a trend and getting real-world data.

Getting active with Active Directory

So reality hit yesterday: we needed to import the data from all of a client’s Active Directories (not many by any standards, and all 2008; I still have nightmares about uncovering forgotten NT4 domains that are “business critical”). Anyway, you can use Power BI Desktop to easily import multiple Active Directories and then merge them into a single table for reporting. That all works really well and is frankly a breeze: you say you want to use an Active Directory, give it some credentials, and then select where and what you want to import. If you need to combine data, you just merge your queries.
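To make the combining step concrete, here is a hypothetical sketch in Python rather than Power Query. The field name `sam_account` is illustrative, and whether you append or join the domain tables depends on your data; this version appends the user tables from each domain and de-duplicates on the account name.

```python
# Illustrative sketch only: each domain import yields a table of user
# records; combining them for reporting is an append plus a de-duplication
# on a key. Field names are made up, not real AD attribute names.
def merge_directories(*domains: list) -> list:
    """Append the user tables from each domain, keeping one row per account."""
    seen, combined = set(), []
    for domain in domains:
        for user in domain:
            key = user["sam_account"]
            if key not in seen:
                seen.add(key)
                combined.append(user)
    return combined

domain_a = [{"sam_account": "jsmith", "dept": "Sales"}]
domain_b = [{"sam_account": "akhan", "dept": "IT"},
            {"sam_account": "jsmith", "dept": "Sales"}]
```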

So why are you wasting our time by making us read this?

It all falls apart when you start to set it up into an App ready to publish. The Enterprise Gateway (On-Premises Gateway) doesn’t support Active Directory connections, so you have two options:

  1. Hire someone to manually refresh reports all day – a great fun job, and I’m sure your staff retention will be sky high
  2. Use a single Windows account for everything (that needs a Windows account), including logging into Power BI and all your data sources – I can hear the panic from anyone with a security background reading this

Well, neither of those options works for my company! So we chose not to choose naff; we chose life instead.

My Ultimate Date Dimension! At least until I figure out a better one…

As you progress in Power BI (and Power Pivot), you quickly find you need a date dimension. The problem is that date dimensions are difficult to build, and even more difficult if you want them to be more than just a list of dates.

“Stop I’m confused… how can a list of dates be more?”

The role of a good date dimension is indeed to be more than just a list of dates. There are functions in Power BI to produce a list of dates; the issue is that you want something more. I’ll be honest at this point, before we dive into the queries I use: I cannot remember where I got the original source. I initially found three or four different date dimension SQL queries, then started pulling them together and adjusting them to fit my needs. For us, the majority of our data was going to come from a SQL Server, so it made sense to have a SQL date dimension. Based on what we did, and after doing some benchmarking within Power BI, I would recommend housing it in a database; putting it in Power BI or Excel won’t stop things working, but I have got the best results from this method.

The important thing about this first part is that there are many, many scripts out there to do what you need. Typically they have the same format:

I’ll make these source files available to download.

  1. Define the table
  2. Populate the dates dimension
  3. Deal with one-character days and months
  4. Set fiscal year months – where I work we use August to July, so I needed to add a CASE statement here (CASE statements sound more complex than they are)
  5. Set fiscal year
  6. Set fiscal quarter
  7. Set fiscal week
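
The post doesn’t reproduce the SQL, but the step 4 and 5 logic boils down to a simple mapping, sketched here in illustrative Python for an August-to-July year. One assumption to flag: the fiscal year is labelled by the calendar year in which it ends, which the post doesn’t actually specify.

```python
from datetime import date

# Sketch of the CASE-statement logic for an August-to-July fiscal year.
# August-December dates belong to the fiscal year that ends the following
# July, so the fiscal year label rolls forward each August.
FISCAL_START_MONTH = 8  # August; adjust for your own year end

def fiscal_year(d: date) -> int:
    """Label the fiscal year by the calendar year in which it ends."""
    return d.year + 1 if d.month >= FISCAL_START_MONTH else d.year

def fiscal_quarter(d: date) -> int:
    """Months elapsed since the fiscal year started, in blocks of three."""
    months_in = (d.month - FISCAL_START_MONTH) % 12
    return months_in // 3 + 1
```

In T-SQL the same mapping is a CASE expression over MONTH(TheDate), which is why the scripts look longer than the logic really is.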

Transparency – reports become the pane of glass

The journey to becoming a data-led organisation through the implementation of a data-centric reporting structure

When you ask yourself the question “what does reporting mean to me?”, the answer is highly subjective. Reporting means different things to almost everyone in an organisation: team leaders will use it to monitor and manage their teams, while C-level consumers will use it to validate and assess progress against strategic goals. The promise of Big Data, and in many ways the biggest value it can bring to any business, is that suddenly it is possible to have everyone accessing the same data, ideally through the same dashboards.

The idea of transparency scares a lot of people, but the bigger question is, of course: is it right to be afraid of being transparent internally? Under no circumstances am I advocating showing customers a warts-and-all reporting window, but internally we must be transparent. All too often, strategic goals have been created or devised using pie-in-the-sky, highly spun tales of business performance, and then success is measured against those same spun metrics. Therein lies challenge #1 of moving your organisation down the transparent route – performance tweaking.

Performance Tweaking – The act of presenting your performance by excluding “outliers”

Removing outliers, or areas outside your control, is standard internal practice for many organisations. However, if you look at what a corporation can and cannot do when it reports financial data, you’ll see a radical shift: transparency. A business cannot simply exclude bad transactions and report only on successful ones; it can classify the bad as such, but it must still be accounted for. Day one on your journey to transparent reporting will no doubt involve a good deal of this – assessing what goes into metrics and how to balance the view. A team leader will want to see everything, but they will not want “everything” to be seen by a C-level.

Challenge #2 – fear. “What would happen if a C-level drilled down to my team to see what they’re doing?” is normally disguised as “senior managers don’t have time to look at all the data, so why make it available?” The goal of transparency is that absolutely everything should be available to everyone, and make no mistake: like any new tool or toy, it will be experimented with when it is first received, leading to some very difficult but ultimately worthwhile conversations as alignment begins to be achieved. After the launch and the first couple of weeks, however, the toy becomes a tool, and a tool is there to be used. The fear that another manager, or even worse someone senior, will come to know more than “me” is a terrifying one. Traditionally, businesses run internally on a currency of information: the information I have about how my team, department or division runs reinforces the traditional hierarchical management chains, and serves to entrench practices that, in the information age, can ultimately be counterproductive.

Fear – The fear that others may come to know more about my area of responsibility than I do

Challenge #3 – the unmentionable. The third challenge is often the most difficult. It is characterised by the throwaway comments “this is too complex for me” and “I’m not an analyst, how can I be expected to understand this?” In many ways this is one of the biggest enemies of progress: the view that a big data tool must somehow be too complex for any “normal” consumer to understand. In truth, it is proof of the need for a clear development model with a consumer-focused design philosophy. Your single pane of glass must have a clear and simple layout that is both intuitive and self-guiding.