I’m still here

Wow, has it been three months already? I’ve been inundated with work and have to say my blog has suffered because of it. In summary, over the last three months I have…

  1. Produced the final version of our Excel reporting tool
  2. Begun the evaluation of reporting tools
    1. Tableau
    2. Power BI
  3. Completed the Tool Evaluation (it wasn’t actually that difficult; there was quite a major differentiator between the tools)
  4. Oh yeah, and run 69 miles – I know, I’m such a child. That really was a lesson though: I learned that sometimes childish jokes can be oh so painful

Confirmation of the selection is now what I’m after; with that I can start to really enjoy the implementation phase.

CSI is still the thing that most organisations I meet and speak with are bothered by. CSI (Continual Service Improvement), or Kaizen, is not about everything you do actually being better. It is about understanding the difference between options and then selecting the most suitable, so that gradually you tweak performance and delivery. For example – this is so British – when you make a cup of tea, should you put the milk in first or last (one does not take sugar with one’s tea)? It would surprise many to hear that there is a chemical difference, and so a subtle flavour difference. The reality is that in this case it’s personal preference, but in the quest to find “your [personal] favourite cup of tea” you have to try both, then select the one you like more. The order you try them in is not relevant; the relevance is the comparison. That is just the same as asking “If we do step X this way or that way, which is best?”

The whole idea that bad can be good blows senior execs’ minds. They just cannot accept it, but really it makes perfect sense. That’s the other side of my job anyway. I hope you can now see why I spend so much time with reporting – hiding under a desk – it’s because in order to know whether A or B is best we have to be able to compare them, and that means reporting on the two states.

32 Bit vs 64 Bit

I want to put a quick post in here because I realised yesterday that I had been a dense idiot with this. I’ve been around IT long enough to know how it works, but I still fell into the trap, so please pay attention to this.

32-bit Office means it can only access the first 2GB of RAM on the system, NOT up to 2GB per 32-bit process. I spent over a week tuning my PowerPivot sheet for Excel 32-bit, thinking “keep the RAM requirements under 1GB for launch and you’ll be fine”. Launch day came and tuning had kept requirements to 950MB (engaging smug mode); HOWEVER, I started getting reports of max-RAM errors. I looked into it, went back to my old-school days of doing technical support, and realised it was things like Outlook (300MB of RAM), Lync 2013 (250MB) and other applications (500MB). Suddenly it dawned on me: I had been suckered into thinking of that 2GB limit as per-process.

The lesson is clear: if you have a hankering to use big spreadsheets and the like, move to 64-bit Office. The old issues of incompatible plug-ins and macros may well still exist, but the difference of being able to use the 4+GB of RAM your system has makes it worth it.

I’m actually amazed that we default to a 64-bit OS these days and most systems have 4+GB of RAM, yet we (and Microsoft) still default to 32-bit Office… makes you wonder.

Too many Dates?

Real world items often have more than one critical date: orders will have an Ordered Date, Dispatched Date, Delivered Date etc. In Incident Management the core dates to begin with are Created Date and Resolved Date, but there may also be status transition dates, i.e. On-Hold/Off-Hold/Assignment Changes etc. I struggled trying to work out how to fit this into a “narrow your data” mentality. In the end I tried a completely different solution in 2.0: I pivoted the dates. In my attempts at PowerPivot goodness I probably went a little too far – I couldn’t get the original idea to work.

Pivoting the dates wasn’t as bad for me as it may be for you: I was able to do it on the SQL server. I created a view from the original source tables; that view pivots the data, and then PowerQuery imports the view, drops the mic and walks off like a pro.

SQL Source – I wrapped mine in a create view statement

-- Each incident row becomes up to three rows: one per non-NULL date column
SELECT [Incident GUID], DateSwitch.DateValue, DateSwitch.DateType
FROM TblAll_Incidents
UNPIVOT (DateValue FOR DateType IN ([Created Date], [Resolved Date], [Closed Date])) AS DateSwitch
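For completeness, here’s what that CREATE VIEW wrapper looks like – the view name is my own choice for illustration, call yours whatever you like:

CREATE VIEW IncidentDates AS
SELECT [Incident GUID], DateSwitch.DateValue, DateSwitch.DateType
FROM TblAll_Incidents
UNPIVOT (DateValue FOR DateType IN ([Created Date], [Resolved Date], [Closed Date])) AS DateSwitch;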

I’ll break it down, because I know I promised not to get overly technical. I’ll also share a secret – GUID. If you talk to DBAs or System Architects long enough you’ll come across this term, and they’ll invariably try to look down their nose at you if you ask what it means, so I’ll tell you. GUID stands for Globally Unique IDentifier. Simply put, a GUID should be unique within the system – if not the whole wide world. They are 16-byte (128-bit) values, usually written as 32 hexadecimal characters, and are semi-randomly generated (I’m not 100% on the science of their generation). Basically, though, when someone says they’re going to use GUIDs it means they’re going to set the database up properly: numeric 1, 2, 3, 4… indexes not only repeat for each table, they also commit the biggest sin in database terms… they change width: 1, 10, 100, 1000.

For those old enough to remember it, that was the worry about the Millennium bug: “How will a computer cope if it has to put 100 (three digits) into the space for 99 (two digits)?” Millions if not billions were spent so we wouldn’t find out the hard way.
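Back on GUIDs – if you want to see one for yourself, SQL Server will generate them on demand. NEWID() is the standard T-SQL function for it; the demo table below is entirely made up:

-- Generate a fresh GUID; note the fixed width, e.g. 6F9619FF-8B86-D011-B42D-00CF4FC964FF
SELECT NEWID() AS FreshGuid;

-- A GUID-keyed table: every key is exactly the same width, forever
CREATE TABLE DemoIncidents (
    [Incident GUID] UNIQUEIDENTIFIER NOT NULL DEFAULT NEWID() PRIMARY KEY,
    [Created Date] DATETIME NULL
);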

The UNPIVOT section is going to get me into trouble. Basically it turns a column into a row; in SQL, PIVOT would turn a row into a column. In essence, though, I’m sticking to what I said: I pivoted the dates, using the UNPIVOT command in SQL. So we’re turning columns into rows. Remember, though, at any given point in the query we’re only looking at one row. That means our single row will be split into as many rows as the columns you mention in the UNPIVOT. Our tickets have three dates – [Created Date], [Resolved Date] and [Closed Date] – so our unpivot turns our one row into up to three (not all tickets have been resolved or closed, remember).
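If you want to see it happen without touching a real table, here’s a tiny self-contained sketch – the GUID and the dates are made up, and the VALUES construct just stands in for the real incidents table:

-- One made-up ticket: created and resolved, but not yet closed
SELECT U.[Incident GUID], U.DateValue, U.DateType
FROM (VALUES ('6F9619FF-8B86-D011-B42D-00CF4FC964FF',
              CAST('2016-06-01T09:15:00' AS DATETIME),  -- Created
              CAST('2016-06-02T11:30:00' AS DATETIME),  -- Resolved
              CAST(NULL AS DATETIME))                   -- Closed: not yet
     ) AS Src ([Incident GUID], [Created Date], [Resolved Date], [Closed Date])
UNPIVOT (DateValue FOR DateType IN ([Created Date], [Resolved Date], [Closed Date])) AS U;
-- Returns two rows (Created Date and Resolved Date); the NULL Closed Date is dropped automatically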


Suddenly there was a way of really pulling the column counts down for the imports. The second step occurs in PowerQuery: when importing, change the DateValue column’s datatype to just Date. You lose the time, but the reality is that for most reports you won’t use it. Retaining the time in the SQL, though, means that IF – or in my case more likely WHEN – senior management say “We want to know what time tickets were created”, you can handle it.
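I do that truncation in PowerQuery, but just to show why keeping the full DATETIME in the view costs you nothing, here’s the SQL equivalent (IncidentDates being the view name I used for illustration above):

-- The view keeps the full DATETIME, so both halves stay available
SELECT DateValue,
       CAST(DateValue AS DATE) AS DateOnly,  -- what the report imports today
       CAST(DateValue AS TIME) AS TimeOnly   -- ready for the day management asks
FROM IncidentDates;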

I got too keen when I first hit upon this and tried to combine all the dates from all the processes I needed to report on – they’re still like that in the SQL – however PowerPivot (or more likely my skills) let me down. If you work out how to combine the dates from a single FACT table into a single “Date Pivot”, please let me know!

Another day another village?

I am starting to really gain an understanding of how Dr Frankenstein must have felt: that terrible mix of pride at what he’d created, and then horror as he watched it lurch off towards another village.

Today I received a Typical Request. This is the sort of thing I’ve got used to – so used to it now, in fact, that as you can tell I generally butcher my notes. Be kind; normally they are only for me to read.

As a “Doer” (note I give myself a capital), I’m used to getting tasks like this, although if I’m honest I’ve done a pretty good job of collating this one into something meaningful, and in this particular case the customer had actually mocked something up… very unusual! This is, however, a great one to show you how it typically goes when I receive a request like this.

Step 1: Meet with customer
Step 2: Panic – What am I doing, what have I agreed to, why am I speaking to myself?
Step 3: Refer to the Data Model – does it have the core data I need?
Step 4: Create additional Measures as required
Step 5: Start setting out reports
Step 6: Meet with customer(s) and present results
Step 7: Rework as required (never ever underestimate this step, your first pass will invariably miss the point and not through anyone’s fault)

Step 7 is by far the hardest step for people to understand. This goes back to my time in Consultancy, but it is a hard lesson to learn and can be a bitter pill to swallow. Emergent Behaviour is not the same as people doing a “Lou” (see Lou and Andy); the reality is much more complex while at the same time being simpler – I’ll jack all this in at some point and become a motivational speaker – EMERGENCE. Emergent behaviours are mathematically unpredictable without going through the steps; a great example of this is Langton’s Ant. So when you present a report to your customer/manager/trained/untrained imp and they say “couldn’t it do…” or “but I wanted…”, generally what they really mean is “Holy cow, that’s blown my socks off. That report shows me <Insert point here>; I wish I could dig deeper.” The emergent behaviour is that if you didn’t do the first draft/pass they wouldn’t know what they wanted. There is no way to predict the outcome without following the steps. Knowing that, and releasing the stress of being asked to make changes, is a vital step. Likewise, the frustration often felt by the requestor needs to be tempered (I find, however, that reminding them of this rarely moves things forwards). Hopefully one will read this and have that “Holy rusty metal, Batman!” moment (other superheroes are available!) and realise that in fact the first pass was necessary to help distil what it is they are looking for and to drive the key thing they actually want – an Invaluable Report rather than worthless jibber jabber.

Good hunting Starbuck!


I created a monster!

2.0

With the 2.0 version I redesigned everything; the ticket tracking is now a robust and complex workhorse. With it anyone (even senior managers – unfortunately) is able to crunch numbers like a pro. The truth about this – and of course the bit that I’m really interested in, Problem Management and Continual Service Improvement – is that the insights you stumble upon are often the most critical.

I’m reminded of the story of the truck stuck in the Brooklyn Tunnel – I don’t know if it’s true, but never let that get in the way of a good story. Anyway, this truck was wedged into the tunnel and the Fire Brigade were there, the police, highways etc., all with different ideas about how to “fix” the problem. A little girl happened to be going past in her parents’ car and wondered why they hadn’t just let the tires down on the truck to unstick it. Right or wrong, fact or fiction, the sentiment is true: often those left-field solutions have merit, and having a tool that allows you to test a hypothesis and see if it is historically true before making the change – then running for a limited period and assessing whether the hypothesis is true or false – is incredibly powerful. We have the capability now to answer any question: “The manager at the <insert name here> office says his site gets terrible service – is that true?” or “Does it take longer to get equipment in <insert country name>?” Now we can do all that, and most importantly we can do it without having to engage BI resources. It was PowerPivot.com – Avi, I think – who put it in a way that I really embrace now. In fact it’s even now part of my Personal Development at work. BI is something for everyone now; it’s not about me controlling and dictating access, it’s about me bringing people into the tooling and working with them to help them access as much as they need to understand everything that is happening within their Service, Site, Division, country, front room?

Three months in and I’m so far loving it, many of the old CMDB development rules are true in PowerPivot (I may have to write them down again in another post). The journey continues….

2.0 A different animal

So I released the 2.0 tool yesterday after completely re-writing the whole data model from the ground up, including redeveloping the SQL. There were some red herrings that I came across, but in general things have gone well: a 25% reduction in file size while adding more data, plus some more intangible efficiencies. Look out for more posts shortly as I go through areas of the development.

To Pivot or not to Pivot

So, after the first proof of concept I am now zeroing in on the 2.0 release, and I have to say it is looking pretty gosh-darn good. Please remember that the nature of the PowerPivot journey means that while I think it’s great now, in a few posts’ time no doubt I’ll be saying it looks like something a misinformed chimpanzee would produce.

The key thing to understand with PowerPivot is the storage model. This is a big thing and really is central to how Microsoft have devised a tool that enables you, on your desktop, to crunch the kind of data sets that previously needed hugely powerful servers to process. As a caveat I will remind you all that even a low-end desktop these days has more processing power than a high-end server of 10 years ago, but that’s honestly not the point!

Microsoft have turned the data storage component on its head, ultimately to align the tool to usage. “Hang on, what do you mean align the tool to usage?”

Simple – when you write a query (and trust me, you basically do that when you set up a pivot table) you start with “give me the rows that match some column criteria from a big table of data”. Please don’t tell anyone, but that’s all SQL queries are! Practical example: Select IncidentID from Incident where status = ‘New’ will provide a list of all Incident IDs in your Cherwell system that have a current status of New. So the columns I am asking for are just “IncidentID”, the rows are all those that match the criteria, and the criteria is Status = ‘New’. Is everyone still with me? Excellent.
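Written out properly (the Incident table and its columns here are just the hypothetical Cherwell-style names from the example above), with the parts labelled:

SELECT IncidentID      -- the columns: what to display
FROM Incident          -- the big table of data
WHERE Status = 'New';  -- the criteria: which rows to keep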

So now we’ve got a basic query, you can start to see the traditional way a database works: rows are everything. You have to ask for columns because rows are always there; I will always get every row that matches my column criteria, either for display (the SELECT part) or filtering (the WHERE part). A traditional database works like this – think of layers of cake or, if you’re a Garfield fan, layers of lasagne. Please don’t overthink it, databases don’t have layers of fruit or mince; maybe I should have gone Shrek and said onions, databases have layers just like onions… mmmmm, lasagne…. Sorry, I’m getting off topic. The layers of a database table – the rows – are designed to be accessed together and stored together, so space is allocated on hard drives ahead of time and then used. This is great for “write it and occasionally read it”. The challenge now, though, is that even a basic laptop these days has more power than a database server of 15 years ago, and the internet has given everyone access to vast quantities of data. The result is that we want to be able to analyse it, and to quote one of my favourite lines from Ernest Borgnine: “People today have attention spans that can only be measured in nanoseconds.” – BASEketball, 1998.

We want instant gratification: you ask Google (other search engines are available) and instantly you get an answer (not necessarily The Answer). Business has the same challenge, so a change in tooling methodology was needed. A query where we need to understand or crunch large quantities of data is more like “show me [the arithmetic (count or total) of something] for [these areas]”. If that seems dumb, let me spell it out: show me total sales by region per quarter. You’ve probably guessed art is not my strongest subject.
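In SQL terms that kind of question is just a GROUP BY – a quick sketch, with the Sales table and its columns entirely made up:

-- "Show me total sales by region per quarter" as a query
SELECT Region,
       DATEPART(QUARTER, OrderDate) AS Qtr,
       SUM(SaleAmount) AS TotalSales
FROM Sales
GROUP BY Region, DATEPART(QUARTER, OrderDate);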

Here are a couple of links from PowerPivotPro to further help if my explanation sucks (doesn’t work for you). I’ll continue below…

http://www.powerpivotpro.com/2010/02/surprising-example-of-powerpivot-compression/

http://www.powerpivotpro.com/2015/12/compression-with-power-pivot/

Version 1 – PoC really

The problem with being in a busy, dynamic workplace is that you never really have the opportunity to fully take stock before pushing on. In the case of my first stab at producing a rich reporting tool in PowerPivot that was definitely the case.

I should probably go back in time a little. I guess you may (or may not – but this is my blog, so tough) want to know where my passion for this stuff came from. A long time ago, and in a different life, I used to produce a lot of reports. Since I’m ultimately lazy (anyone who does this stuff is, they just say they’re not) I wanted to create a self-service reporting tool set. Back in 2006 and 2007 that was really done by hooking Access databases to a true database backend and then using a simple menu system. It was interesting, and dare I say fun, designing them; that’s really what got me into VBA. From there I ended up moving into VBS and taking a detour from reporting, going back to where I started: break/fix and solutioning.

Wow, I’m heading off topic here. I may write some more posts on my progression through the skills. The point is that three years ago, when PowerPivot was first brought to my attention, I could see the potential but I couldn’t work out how to leverage it. So many people just don’t get Excel, and for many, pivot tables might as well be in an alien language. So it took me a while to start to venture into the realm of PowerPivot; the biggest stumbling block was of course that my head was stuck in the old-style way of doing things, both for Excel and SQL (database development/structuring). A year ago I had to start using PowerPivot after a spreadsheet got, well, just too big. That’s not an ideal scenario.

So I had a reporting solution that was creaking, an ITSM project to implement and reporting to migrate. The perfect storm…. But I still didn’t understand what I’m afraid I have to call the Power Tools – PowerPivot, PowerBI, Power Query and PowerView (might as well throw PowerMap in too, though really I consider it part of PowerView). What works best where? How should I structure my data? You could say I chickened out and went with the “easy” option: I migrated what I had but split out some of the dimensions. Really I didn’t know if it would work, or even if I would be able to produce something that would come close to my vision.

1.0 is very definitely a great reporting tool, and I was blown away by it, although as you can see from the title it was nothing more than a PoC. There are mistakes in the fundamental data model, and while the PowerPivot learning curve is steady, it required me to unlearn so much that it took that perfect storm of a new ITSM tool to make me re-write all the queries and then re-wrap a presentation layer. The experience actually came together after a number of hours bashing my head against a brick wall; my Eureka moment was actually attending a webinar hosted by Avichal Singh. Something went click, and 1.0 was born about 24 sleepless hours later.

When I say born, make no mistake: the “Advanced Reporting Tool”, as I call it, is a living entity, and like Dr Frankenstein watching his creation lumber off towards the nearest village, I had to watch my creation become something more than I had intended. Suddenly Senior Management were using the report and asking questions about performance. You see, I not only gave access to summary information but also allowed them to drill down into details; suddenly they could access a wealth of data at their fingertips. The shock moment for me was when I realised I had messed up. I experienced a hardware failure just before starting the final phase, you see, and had to rebuild my laptop; suddenly my old data sources weren’t working. I took this as the time to take the plunge and move to PowerQuery – a mistake in my humble opinion, although not for the reasons many of you would think: everyone wanting to refresh the report (it can be updated daily) has to have PowerQuery installed and enabled within Excel.

My second mistake I still have trouble with: I used Excel 2013’s Timeline slicer. This is a controversial topic and one that I’ll try to give proper time to later.


A new dawn

A couple of years ago I began my journey into the world of the Microsoft PowerPivot suite. Being honest, as someone who got on fine with Excel, PowerPivot seemed a little “sledgehammer, meet nut”. I looked at a number of things to use it for, oftentimes having to discount them because my audience just wasn’t ready. Then that all changed. Over the course of 2015 I was heavily involved in a major ITSM system upgrade, which required me to review all our current reporting and make sure it was migrated to the new tool. Like almost every company I have ever worked for, we couldn’t leverage the tool’s reporting engine immediately because of our requirements (we like to fiddle and not follow our own rules – doesn’t everyone?).

The net result: I got the opportunity to implement what I had been planning for two years but hadn’t been able to get enough traction behind. We implemented a Power Query and PowerPivot based reporting solution. Over the course of a year I will be implementing a number of major improvements to the core report pack, as well as learning more and more about the tool.

I hope you’ll join me on this exciting journey (I promise I’ll try and put some humour in somewhere).