A Blank Slate

I recently read a blog post and two papers by Barry Devlin in which he considers whether it’s time for some major evolution in data warehouse architecture. His point is that our “venerable” data warehouse architecture has evolved over decades with a set of assumptions about database hardware and software. In a recent blog post on the BeyeNetwork site entitled Data Warehouses and Solid State Disks (SSD), Devlin says:

“Over the past couple of years, we’ve seen dramatic improvements in database performance due to hardware and software advances such as in-memory databases, columnar storage, massively parallel processing, compression, and so on …”.

He goes on to question several of the old assumptions about what is possible in a single database.  The most interesting one for me is, as he puts it:

“Do you still need that Data Mart?  With so much faster performance, maybe the queries you now run in the Mart could run directly on the EDW.  Reducing data duplication has enormous benefits, on storage volumes, but principally in reducing maintenance of ETL to the Marts.”

I have no experience with the new technologies that he cites. But I have lots of experience with today’s established DW architectures. If an investment in new technology could significantly reduce the amount and complexity of data warehouse processing, then it’s worth serious consideration.

I’m just beginning to draft a road map for a real BI/DW program for my new employer. I’m ready to lead the BI/DW team in a charge down well-trodden paths of design and implementation. I’m formulating resource plans for building out the team. This is what they hired me to do. They’re paying for my experience in doing these things. Now I’m wondering if, at least to some extent, I would be doing a disservice to the company by employing that experience.

This is exactly the kind of situation that got me interested in documenting my experiences in this job – a blank slate is both compelling and intimidating. I don’t have a lot of answers at the moment. But I’m liking the questions.


Visualize Fishing

Just a quick note: If you want to see an interesting example of viral marketing, check out Tableau Public. It’s an ingenious approach to spreading the word about Tableau’s data visualization and analysis products. There seems to be a fair amount of buzz about it in the Twitterverse.

Take a look at this dashboard I made based on summer salmon catch records in central Puget Sound. It tells you everything you need to know about when and where to fish for the most desirable species of salmon. Tableau is cool.

Salmon Dashboard

A POC On Both Your Houses

Back to the Tableau conundrum. As I mentioned in an earlier post, one of the first things I was tasked with was to fast-track the implementation of Tableau Server — not evaluate self-serve reporting tools in general or even whether Tableau is a good choice for our company. In fact the deal was to just get the server up and running and hand it over to the business. They are so desperate for reporting that they volunteered to administer the server themselves because they were told IT has no bandwidth.

Okay then. Which battle to pick? I could make a good case for doing nothing and telling everyone to wait for me to get up to speed. It wouldn’t go over well and there would be a lot of pressure to make a commitment to an alternative solution, which I’m not really prepared to do. But I think I could win the argument. No doubt some will say this is what I should do. Slow things down. Understand requirements. Make a deliberate, informed choice.

Well I didn’t do that. I also didn’t completely give in and let the user community go it alone. Instead, I argued that the BI team has to own the company’s reporting tools. So we are going to administer the server. Also, instead of making an immediate purchase for as many users as possible (so we can get a volume discount on the licenses, also part of the original “plan”), we are going to run in proof-of-concept mode for a while using Tableau’s free trial with a limited number of users.

The trial period will show us what it’s really like to run the product, in particular how useful the Viewer license is compared to the Interactor license. There’s a big difference in price: $200 versus $1000. The Viewer license does not allow you to do the very cool visual slicing and dicing that makes Tableau such a sexy product; it only presents data in a static form. For the fancy stuff, you have to pay $1000 a seat (before volume discounts). The Viewer license may not be worth purchasing. We’ll see. I’ll report back on this issue in a couple of weeks when the POC is wrapping up.
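The back-of-envelope math is part of what the POC has to settle. Here’s a minimal sketch of the seat-cost comparison, using the list prices above; the 50-user scenario and the discount rate are placeholder assumptions, not actual quotes:

```python
# Rough license-cost comparison for a Viewer/Interactor seat mix.
# List prices are from the post ($200 Viewer, $1000 Interactor);
# the user counts and discount rate below are made-up placeholders.

VIEWER_PRICE = 200
INTERACTOR_PRICE = 1000

def license_cost(viewers: int, interactors: int, discount: float = 0.0) -> float:
    """Total seat cost for a given mix, with an optional volume discount."""
    gross = viewers * VIEWER_PRICE + interactors * INTERACTOR_PRICE
    return gross * (1 - discount)

# All-Interactor vs. a mixed deployment for a hypothetical 50 users:
all_interactor = license_cost(0, 50)    # 50 * 1000 = 50,000
mixed = license_cost(40, 10)            # 40 * 200 + 10 * 1000 = 18,000
```

If most users turn out to need only static views, the mixed deployment is a fraction of the all-Interactor price — which is exactly why the POC needs to show whether the Viewer experience is good enough for them.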

By the way, this experience has reminded me that I’m mad at BI product vendors, especially the “new paradigm” guys like Tableau and QlikView. They make great looking products and market directly to my internal business customers telling them that IT is an obstacle and that if only IT would give them the right tools and get out of the way, they would realize all their analytical dreams. I take being called an obstacle personally.

Still Fogged In But Conditions Improving

I’m trying to find a balance between dealing with the work that was already in progress when I arrived and moving the focus from being reactive to acting according to a strategy. That means not getting sucked too far into the day-to-day. That’s a tough thing for me since I feel more comfortable when I’m in control of operational details. But I’m going to have to let that go, at least to some extent.

I need to concentrate on defining and selling the BI/DW strategy. I’ve been working on that this week. The approach is to define the current state of BI/DW capability and need, define what the future state should look like, and then describe how we get from where we are to where we want to be. Sounds simple enough, but there’s lots of work in sussing out the details for each of the three broad categories. I’m building a proposal/presentation around describing the current state in terms of the TDWI BI maturity model and then describing what greater maturity looks like and what the benefits to the organization are. Finally, I’ll lay out a road map for a robust BI program that drives us to the right-hand side of the maturity model.

For communicating the basic shape of a BI road map, I like the Kimball framework. It shows how BI programs grow by advancing on three tracks simultaneously: Architecture/Technical Infrastructure, Dimensional Model, and End-User Tools. This is a framework that I’ve successfully used in the past for defining BI program strategies. However, there is a very important class of activity that this framework does not address. I think it falls under the heading of organizational change management.

Moving through the levels of BI maturity always entails some degree of cultural change, sometimes a great deal of cultural change. For example, how do you get the independent reporting groups that have created their own data marts (spreadmarts) to either give up their responsibilities to a centralized BI team or become part of an extended virtual BI team that operates according to a common set of best practices?

There is an even more fundamental cultural shift that some organizations have to go through: getting people to be fact-based in their decision-making. Many times I’ve asked an executive or manager “what questions do you need to answer about your business (or the process) to make decisions on how to manage it?” only to be met with a blank stare or a Kramden-esque hamana-hamana. That’s when you know there isn’t much of an internal market for BI services. In that case you have to make the market by helping drive the cultural change from gut-feel management to data-driven management.

We’ll see how it goes at this company.

Week 1 on the Job

Most of this week I spent learning the lay of the BI land. There are many departmental reporting groups that have their own data silos. I knew this going in, so no surprises. I need to gather information about them – put together an inventory. Part of the program strategy will be to bring some of the groups under the umbrella of a competency center-like structure while taking over the work of the others within the BI team proper. An inventory of subject areas, tools, strengths and weaknesses of the departmental reporting groups will be critical to making those decisions.
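One way to keep that inventory honest is to capture it in a simple structured form from day one. A minimal sketch of what I have in mind — the group name and field values are hypothetical examples, not our actual departments:

```python
from dataclasses import dataclass, field

@dataclass
class ReportingGroup:
    """One departmental reporting group in the BI inventory."""
    name: str
    subject_areas: list = field(default_factory=list)
    tools: list = field(default_factory=list)
    strengths: list = field(default_factory=list)
    weaknesses: list = field(default_factory=list)
    disposition: str = "undecided"  # "competency center" or "absorb into BI team"

inventory = [
    ReportingGroup(
        name="Sales Ops",  # hypothetical group
        subject_areas=["bookings", "pipeline"],
        tools=["Tableau", "Excel"],
        strengths=["deep domain knowledge"],
        weaknesses=["queries transactional systems directly"],
    ),
]

# Simple first-pass triage: groups with real strengths are candidates
# for a federated competency center; the rest get absorbed.
for g in inventory:
    g.disposition = "competency center" if g.strengths else "absorb into BI team"
```

The triage rule here is deliberately crude; the point is just that the subject areas, tools, strengths, and weaknesses all live in one place where the competency-center-versus-absorb decision can be argued from evidence.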

Right off the bat, I’m faced with a tough decision about a BI tool. One reporting group has several Tableau client licenses and wants to get Tableau Server. Being new, I have no idea why Tableau was chosen or how it’s being used. I don’t know what need they think having a server license will satisfy. It seems the decision to allow this to happen was made before I arrived. I’ve been asked to meet with the users and then facilitate the acquisition and implementation of the server.

The scariest thing about this is that they’re using Tableau to query against their core transactional systems, not against a data warehouse. Yikes. Perhaps this particular user group is savvy enough about the data that it’s less of a risk than it seems on the surface. I just don’t know yet.

I’ll meet with the users next week and then I’ll have to decide whether I need to push back on this plan. The tough thing is that if I feel I have to try to stop this initiative, I’ll have to do it without having an alternative solution to offer. This genie may just need to be let out of its bottle for the short term.