Author Archives: David Teich

BI is Dead! Long Live BI!

In memory! Big Data! Data Analytics! The Death of BI!!!!

We in technology love the new idea, the new thing. Often that means we oversell the new. People love to talk about technology revolutions, but those are few and far between. What usually happens is evolution, a new take on an existing idea which can add new value to the same concept. The business intelligence space is no more immune to the revolution addiction than any other area, but is it real?

Let’s begin to answer that by starting with the basic question: What is BI? All the varied definitions come down to the simple issue that BI is an attempt to better understand business through analyzing data. That’s not a precise definition but a general one. IT was doing business analysis before it was called that, whether in individual systems or ad hoc through home-grown software. Business intelligence became a term when MicroStrategy, Cognos, Business Objects and others began to create applications that could combine information across systems and provide a more consistent and consolidated view of that data.

Now we have a new generation of companies bringing more advanced technologies to the fore to improve data access, analysis and presentation. Let’s take a look at a few of the new breed and understand whether they’re evolution or revolution.

Some Examples

In-Memory Analysis

As one example, let’s discuss in-memory data analysis as headlined by QlikTech, Tableau and a few others. Loading all the data to be analyzed into memory makes performance much faster. The lack of caching and other disk access means blazingly fast analytics. What does this mean?

The basic response is that more data can be massaged without appreciable performance degradation. We used to think nothing of getting weekly reports; now a five second delay from clicking a button to seeing a pie chart is considered excessive. However, is this a revolution? Chips are far faster than they were five years ago, not to mention twenty. However, while technologies have been layered to squeeze more junctions, transistors and so on into smaller spaces, it’s the same theory and concept. The same is true of faster display of larger data volumes. It’s evolutionary.

Another advantage of the much faster performance is that calculations can be much more complex; moving from simple pivot tables to scatter charts with thousands of points is a change in complexity. However, it’s just another way of providing business information to business people in order to make decisions. It’s hard to find a way to describe that as revolutionary, yet some do.
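
To make the evolution point concrete, consider how little changes for the analyst. Here is a minimal sketch in Python with pandas; the file and column names are invented for illustration. Once the data set sits in RAM, every subsequent pivot or filter avoids disk entirely, which is the whole in-memory trick.

```python
import pandas as pd

# One up-front disk read; everything after this is pure in-memory work.
sales = pd.read_csv("sales.csv")  # hypothetical transaction extract

# A pivot-table style summary that recomputes near-instantly as the
# analyst slices and re-slices, because no disk access is involved.
by_region = sales.pivot_table(
    values="revenue", index="region", columns="quarter", aggfunc="sum"
)
print(by_region)
```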

Big Data

Has there been a generation of business computing that hasn’t thought there was big data? There’s a data corollary to Parkinson’s law which says that data expands to fill available disk space. The current view of big data is just the latest techniques that allow people to massage the latest volumes of data being created in our interconnected world.

Predictive Analytics

Another technical view of business holds that because software has historically been limited to describing what’s happened, business leaders weren’t using it for predictive analysis. Management has always been the combination of understanding how the business has performed and then predicting what would be needed for future market scenarios in order to plan appropriately. Yes, there are new technologies that are advancing the ability of software to provide more of the analytics to help with prediction, but it’s a sliding of the bar, allowing software to do a bit more of what business managers had been doing through other means.

But there’s something new!

Those new technologies and techniques are all advancing the ability of business people to do their jobs faster and more accurately, but we can see how they are extensions of what came before. Nobody is saying “Well, I think I’ll dump my old dashboards and metrics and just use predictive analysis!” The successful new vendors are still providing everything described in “legacy BI systems.” It’s extension, not replacement – and that’s further proof of evolution in action.

Visio (ask your parents…) no longer exists, while Crystal Reports (again…), Cognos, Business Objects and others have been absorbed by larger companies to be used as a must-have in any business application. In the same way, none of the current generation of software companies is saying that the latest analytics techniques are sufficient. They all support and recreate the existing body of BI tools into their own dashboards and interfaces. The foundation of what BI means keeps expanding.

“Revolutionary”?

It’s really very simple: technology companies tend to be driven by technologists. Stunning, I know. Creating in-memory analytics was a great piece of technical work. It hadn’t been successfully done until this generation of applications. We knew we were getting more and more data accumulated from the internet and other sources, then some brilliant people thought up algorithms and supporting technologies that let people analyze the data in much faster ways. People within the field understood that these technical breakthroughs were very innovative.

On the other side, there’s marketing. Everyone’s looking for the differentiator, the thing that makes your company stand out as different. Most marketing people don’t have the background to understand technology, so when they hear that their firm is the first, or one of the first, to think of a new way of doing things, they can’t really put it into context. New becomes unique becomes revolutionary. However, that’s dangerous.

Well, not always.

To understand that, we must refer to one of the giants in the field, Geoffrey Moore. In “Crossing the Chasm” and “Inside the Tornado,” Moore formally described something that many had felt: that there’s a major difference between early adopters and the mass market. There’s a chasm that a company must cross in order to change from addressing the needs of the former to the needs of the latter.

The bleeding-edge folks want to be revolutionary, to see themselves as visionaries. Telling them that they need to spend a lot of money and a lot of sleepless nights on buggy new software because it’s a step in the right direction doesn’t work quite as well as telling them it’s a revolutionary approach. As founders like to think of themselves in the same place, it’s a natural fit. We’ve all joined the revolution!

There’s nothing wrong with that – if you are prepared for what it entails. There are people who are willing to get the latest software. Just make sure you understand that complexity and buggy young code don’t mean a revolutionary solution, just a new product.

Avoiding Revolution

Then there’s the mass market. To simplify Moore’s texts, business customers want to know what their neighbors are doing. IT management wants to know that their always understaffed, underfunded and overworked department isn’t going to be crushed under an unreasonable burden. Both sides want to know their investment brings a proper ROI in an appropriate time frame. They want results with as little pain as possible.

We’ve seen that the advances are in BI, not past BI. Yet so many analysts are proclaiming the death of something that isn’t dying. How does that help the market or younger companies? It points people in the wrong direction and slows adoption.

Not all companies have fallen for that. Take one of the current companies making the most headway with new technologies, Tableau Software. They clearly state, on their site, “Tableau is business intelligence software that allows anyone to connect to data in a few clicks, then visualize and create interactive, sharable dashboards with a few more.” They have not shied away from the label. They’re not trying to replace the label. They’re extending the footprint of BI to do more things in better ways. Or QlikTech’s self-description “QlikTech is the company behind QlikView, the leading Business Discovery platform that delivers user-driven business intelligence (BI).”

There are many companies out there who understand that businesses want to move forward as smoothly as possible and that evolution is a good thing. They’ll talk to you about how you can move forward to better understand your own business. They’ll show you how your IT staff can help your business customers while not becoming overwhelmed.

Know if you’re willing to be an early adopter or not, then question the vendors accordingly. As most of you are, purely by definition, in the mass market, move forward with the comfort that BI is not going anywhere and plenty of companies are merging the old and new in ways that will help you.

To paraphrase Mark Twain, the reports of BI’s death have been greatly exaggerated.

1010data at the BBBT: Cool technology without a clear strategy

The presenters at last Friday’s BBBT session were from 1010data. The company provides some complex and powerful number crunching abilities through a spreadsheet paradigm. As with many small technical companies, they have the problem of differentiating between a technology and a business solution.

Let’s start with the good side, the engine seems very cool. They use distributed technology in the Cloud to provide the ability to rapidly filter through very large data sets. It’s no surprise that their primary markets seem to be CPG and financial companies, as they are dealing with high volumes of daily data. It’s also no surprise because they must have very technical business users who are used to looking at data via spreadsheets.

The biggest problem is that spreadsheets are ok for looking at raw data, but not for understanding anything except a heavily filtered subset of it. That’s why the growth of BI has been in visualization. Everything 1010data showed us involved heavy work with filters, functions, XML and more. The few graphics they showed look twenty years out of date and quite primitive by modern standards. This is a tool for a power user.

Another issue, showing the secondary thought given to re-use and display of information, is their oxymoronically named QuickApps. As the spreadsheet analysis is done in the cloud on the live data set, if someone wants to reuse the information in reports, a lot of work must be done. The technical presenter was constantly diving into complex functions and XML code. That’s not quick.

When asked about that, the repeated refrain was about how spreadsheets are everywhere. True, but the vast majority of Microsoft Excel™ users use no functions, or only the very simplest such as sum() and a few others. Only power users create major results, and BI companies have grown by moving people from Excel to better ways of displaying results.

I must question whether CEO and Co-founder Sandy Steier understands where the company fits into the BI landscape. He constantly referred to Cognos and MicroStrategy as if they’re the current technology leaders in BI. Those solutions are good, but they are not the focus of conversation when talking about the latest in visualization or in-memory technologies. The presentation did have one slide that listed Tableau, but their web site was devoid of references to the modern generation (or they were well hidden). Repeated questions about relationships with visualization vendors were deflected to other topics and never addressed.

A key focus was an early statement by Mr. Steier that data discovery is self-service reporting. There seems to be the typical technical person’s confusion between technology and business needs. Data discovery is the ability to understand relationships between pieces of data to build information for decision making. Self-service reporting is just one way of telling people what you’ve discovered. Self-service business intelligence is a larger issue that includes components from both.

I very much liked the technology, but I must question whether the management of 1010data has the vision to figure out what they want to do with it. Two of many possible options show the need for that choice. First, they can decide to be a new database engine, providing a very powerful and fast data repository from which other vendors can access and display information. Second, they can focus on adding real visualization to help them move past the power users so that regular business users can directly leverage their benefits. The two strategic choices mean very different tactics for implementation.

To summarize: I was very impressed with 1010data’s technology but am very concerned about their long term potential in the market.

ETL across the firewall: SnapLogic at the BBBT

SnapLogic presented at the BBBT last Friday. I was on the road then, so I watched the video today. The presentation was by Darren Cunningham, VP Marketing, and Craig Stewart from product management. It was your basic dog and pony show with one critical difference for the BI space: they understand hybrid systems.

Most of the older BI vendors are still on-premises and tip-toeing into the Cloud. Most of the newer vendors are proudly Cloud. The issue for enterprises is that they are clearly in a strongly hybrid situation, with a very mixed set of applications within and outside the firewall. Companies talk about supporting hybrid systems, but dig down and you find it’s one way or the other, with minimal thought given to supporting the other half.

Darren made it clear from the beginning that SnapLogic understands the importance of a truly hybrid environment. They are, ignoring all the fancy words, ETL for a hybrid world. They focus on accessing data equally well regardless of on which side of the firewall it resides. Their partner ecosystem includes Tableau, Birst and other BI vendors, while SnapLogic focuses on providing the information from disparate systems.
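
As a rough illustration of what “ETL for a hybrid world” implies (a minimal sketch of my own in Python, not SnapLogic’s API; the database, table and endpoint are hypothetical), the pipeline treats a source behind the firewall and a cloud source as interchangeable inputs:

```python
import sqlite3            # stand-in for an on-premises database

import pandas as pd
import requests

# Extract from inside the firewall...
with sqlite3.connect("onprem_orders.db") as conn:
    onprem = pd.read_sql_query("SELECT customer_id, amount FROM orders", conn)

# ...and from a cloud application's REST API (hypothetical endpoint).
resp = requests.get("https://api.example-crm.com/v1/accounts", timeout=30)
cloud = pd.DataFrame(resp.json())

# Transform and merge; neither step cares which side of the firewall
# the rows came from.
combined = onprem.merge(cloud, on="customer_id", how="left")
print(combined.head())
```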

Their view was supported by a number of surveys they’d performed. While the questions listed had the typical tilt of vendor-sponsored surveys, they still provided value. The key slant, which has implications for their strategic planning, is shown by one survey question on “Technical Requirements of a Cloud Integration Platform.” “Modern scalable architecture” came in first while “Ease of use for less technical users” was third.

As Claudia Imhoff accurately pointed out, the basic information might be useful, but it’s clear from their presentation that this was an IT-focused survey and should be treated as such. It would be interesting to see the survey done similarly for both IT and business users to see the difference in priorities.

SnapLogic looks like they have a good strategy; the thing to watch is how they grow. The key founder is Gaurav Dhillon, one of the founders of Informatica. He had a good strategy but was replaced when the company grew to a point where he couldn’t figure out the tactics to get over the Chasm (full disclosure: I worked at Informatica in 1999, when it hit the wall. I’m not unbiased). Let’s hope he learned his lesson. There’s a clear opportunity for SnapLogic’s software, and it seems to be going well so far, but we’ll need to watch how they execute.

TDWI: Evolving Data Warehouse Architectures in the Age of Big Data Analytics

Today’s TDWI webinar was a presentation by Philip Russom, Research Director, on “Evolving Data Warehouse Architectures in the Age of Big Data Analytics.” It was based on his just-released research paper of the same name. Overall, it was a good overview of where their survey shows the market to be, but I have a nit.

One of the best points Mr. Russom made was during his discussion of responses to “What technical issues or practices are driving change in your DW architecture?” The top response, at 57%, was Advanced Analytics. Philip pointed out to everyone that’s about as good a proof as you get of business driving technology. Advanced analytics is a business need, and that is pushing the technical decision. Too many people get too wrapped up in technology to realize it exists to solve problems.

[Slide: TDWI, what is data warehouse architecture]

Another key point was made about the evolving nature of the definition of data warehousing. Twenty years ago, it was about creating the repository for combining and accessing the data. That is now definition number three; the top two responses show a higher-level business process and strategy in place than “just get it!”

Where I have a problem with the presentation is when Mr. Russom stated that analytics are different from reporting. That’s a technical view and not a business one. His talk contained the reality that first we had to get the data and now we can move on to more in-depth analysis, but he still thinks they’re very different. It’s as if there’s a wall between basic “what’s the data” and “finding out new things,” concepts he said don’t overlap. Let’s look at the current state of BI. A “report” might start with a standard layout of sales by territory. However, the Sales EVP might wish to wander the data, drilling down and slicing & dicing to understand things better by industry within a territory, by cities within it, and by other metrics across territories. That combines what he defines as separate reporting and data discovery.
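
To make that continuum concrete, here is a minimal sketch (Python with pandas; the data and column names are invented for illustration) in which the standing report and the EVP’s ad hoc exploration are the same operation over the same data, just grouped at different levels:

```python
import pandas as pd

sales = pd.DataFrame({
    "territory": ["East", "East", "West", "West"],
    "industry":  ["Retail", "Tech", "Retail", "Tech"],
    "city":      ["Boston", "NYC", "Denver", "Seattle"],
    "revenue":   [120, 200, 90, 310],
})

# The "report": a fixed layout of sales by territory.
print(sales.groupby("territory")["revenue"].sum())

# The "discovery": drilling into industry within territory, then city.
print(sales.groupby(["territory", "industry"])["revenue"].sum())
print(sales.groupby(["territory", "city"])["revenue"].sum())
```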

Certainly, the basic row-based reporting can switch to columnar structures for better analytics, but that’s techie. The business user sees a simple continuum. Like most other areas where technology advances (pretty much all of life…), business software solves a problem and then allows people to see past that one to the next. That successor problem doesn’t have to be completely different, and in the case of reporting and analytics, it’s not.

The final takeaway I received from his webinar helped support that concept even if his earlier words didn’t. He talked about the Multi-Platform Data Warehouse Environment: data warehouses aren’t going anywhere; they’re only being incorporated into a wider ecosystem of data technologies in order to continue to improve the understanding and decision making capabilities of business managers.

Other than the disagreement I have with his view of reporting and analytics, I heard a good presentation and suggest people check out the full report.

Qlik at BBBT. QlikView Expressor: Come for the ETL, stay for the Data Governance

Last Friday’s BBBT presentation was by Qlik and the primary purpose was to discuss QlikView Expressor, but that was just a foundation for what really caught my eye, a great advance in data governance.

Qlik bought Expressor Software last June, and the presentation was the chance to show the re-branded products to the analysts in the group. Expressor brings baby ETL to Qlik. The presenters, Donald Farmer and Bill Kehoe, were very honest and clear that the product is intended for those who start with basic self-service BI and find they need to reach multiple sources as they begin to expand their use. I’ll be clear: this is a baby product. Their public case study was, according to the slides, using ODBC and Salesforce.com’s very open API. How they can, and even whether they should, handle access to more complex and proprietary systems remains a big question.

As Informatica and other major players in the ETL space have strong partnerships with Qlik, it’s a careful game Qlik has to play. On one side, they have to provide some basic ETL functionality to a key portion of their market, on the other side they have to not alienate the big players. Often products acquired in such a middle ground either fail from the lack of a clear solution or cause problems with partnerships, but only time will tell how Qlik will handle this. For the time being, I don’t see this product being a threat to their partners.

The presenters waffled on the early message about ETL being a way to govern access to data, and why became very clear as the presentation entered its second section. QlikView Expressor is being used as a component driving Qlik’s new QlikView Data Governance Dashboard. The company has done an amazing job of blending ETL, their existing and well known BI presentation software, and a smart overview of the full architecture to take a very good step forward in helping companies understand where their data is being used.

As Donald Farmer pointed out, only half humorously, “Microsoft Office for the iPad has killed data governance.” KPIs defined in multiple departments, different reports on different computers and the growth of laptops made data governance difficult in the previous decades. The boom in tablet use has expanded that challenge exponentially. Having the leading business productivity suite now available on the leading tablets means company reports, spreadsheets and more are spread even further through the business ecosystem. Data governance becomes vastly more difficult to achieve.

The Data Governance Dashboard is a first step in helping IT and business users understand what information is out there, where it’s residing and how much of it is used how often.

This isn’t a blog about the features, but one must be mentioned because it is critical. Knowing what data fields are being accessed by what BI reports is important by itself, and Qlik and others have been looking at that. The extension that matters is their reports that begin to link the report labels users see to the internal fields. Think about the ability to see that two divisions both use the label “Gross Profit” and then understand that they’re using different fields, definitions and underlying data to create the displayed numbers.
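
To see why that lineage link matters, here is a hedged sketch of the kind of check it enables (plain Python; the catalog entries are invented and this is not Qlik’s implementation): map each on-screen label back to its underlying definition and flag labels that different groups compute differently.

```python
# Hypothetical report catalog: label shown to users -> underlying definition.
report_catalog = [
    {"division": "Retail",    "label": "Gross Profit", "field": "revenue - cogs"},
    {"division": "Wholesale", "label": "Gross Profit", "field": "revenue - cogs - freight"},
    {"division": "Retail",    "label": "Units Sold",   "field": "quantity"},
]

# Collect every distinct definition used under each label.
definitions = {}
for entry in report_catalog:
    definitions.setdefault(entry["label"], set()).add(entry["field"])

# Flag labels that mean different things in different divisions.
for label, fields in definitions.items():
    if len(fields) > 1:
        print(f"Conflict: '{label}' is computed {len(fields)} ways: {sorted(fields)}")
```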

Self-service BI is reaching the point where desktop computers were in the early 1990s. The 1980s saw business users run away from what they saw as a controlling IT infrastructure. It helped and it created confusion. IT and business had to find new ways to work together, reintegrate and help both sides help the company. The Governance Dashboard is something that can help lead the BI industry into that same step to help both sides provide improved and consistent information to business decision makers. Well done.

TDWI Webinar – Preparing Data for Analytics with Liaison Technologies

Tuesday’s TDWI Webinar was titled “Preparing Data for Analytics,” but that’s not what it was about. It was still interesting, just misnamed. The focus was, as would be expected with Liaison being the sponsor, about how managing data in the Cloud can enhance the ability of some companies to support BI.

It started with Philip Russom, an in-house TDWI analyst, talking a bit about preparing data without ever using the words extraction and transformation. The most interesting point he made was an aside in the main presentation, but one that should be brought to the fore: what is SMB?

Most of us think of SMB as Small-to-Medium sized Businesses. His point was that it’s really Small-to-Medium sized Budgets. Many folks involved in enterprise software understand that you often don’t sell an enterprise license up front. A software vendor will sell to a department or a business division with a smaller budget that needs to get something done. Then the tactic is to expand within the enterprise to build it into a major account. Mr. Russom makes the great point that the tactics to sell into the smaller groups in an enterprise are very similar to those used to sell to smaller businesses, so maybe there are opportunities being left on the table by smaller vendors.

His other key point needs a webinar of its own. He mentioned that companies looking for Cloud analytics should “make sure Cloud solutions are optimized for data and not for applications.” It’s the data that’s important, how you get it and how you prepare it. That’s what has to be supported first, then applications can access the data. Sadly, he said that and moved on without any real details on what that preparation means. I’d like to see more details.

The main speaker was Alice Westerfield, Sr. VP of Enterprise Sales at Liaison Technologies. Her main point followed Russom’s lead-in, pushing the idea that a good analytics platform requires moving from an application-centric approach to a data-centric one. No surprise, Liaison has one handy. Most importantly, it’s a Cloud approach, one they’ve been offering since before Cloud became the buzzword.

Alice was brief but focused on their history of supporting integration in the Cloud. The four main benefits she mentioned were:

  • Data Integration
  • Data Transformation
  • Data Management
  • Data Security

That makes sense, and we all know how much the last one, security, matters before people are willing to perform the first three in the Cloud. However, it’s the changing nature of the data game that means I want to focus on the first, data integration.

While Liaison talks about the benefits of leveraging their years of integration skills rather than reinventing or needing to take new integrations and install them in an on-premises solution, there’s another Cloud aspect that I think is critical. Most businesses use a mix of applications and many are already on the Cloud. Add to that the mobile nature of today’s generation of BI solutions which are provided in the Cloud. It makes sense for many SMBs to leverage that. Why take data from Cloud apps, move them on-premises and then move them back to the Cloud? Using a service such as Liaison simplifies and speeds the process of meshing data from within and outside of the firewall and then providing wide access to knowledge workers through the new BI interfaces.

For the foreseeable future, there will continue to be reasons for keeping data within the firewall, but for most data and most companies, a solution such as Liaison’s would seem to be an opportunity to quickly integrate data and share it as broadly as required.

The Webinar That Couldn’t: Modernizing the Traditional BI Environment

Years ago, I was a programmer. Then folks started calling themselves software engineers. Of course, programming isn’t engineering. The best programmers combine science, engineering and art as do all good craftsmen, but words such as engineer and scientist hold an allure (and the potential for more money) to some people who aren’t as confident as they should be, hence such modern titles as Sanitation Engineer.

Why does that line of thought come to mind? I spent an hour yesterday listening to a TDWI webinar where one focus was on the Data Scientist when it should have been on business intelligence. The presenters were independent analyst Colin White and IBM Product Marketing Manager Brent Winsor. The typical analyst & company presenter webinar covers a key theory impacting a market and then how a specific product addresses that issue or need. Not so on Thursday.

My primary issue is the artificial split in the claim that BI concerns itself with descriptive and diagnostic analysis while data science looks towards predictive and prescriptive analysis. The main point of many of the new analytics for data mining, created by people calling themselves data scientists, is to understand patterns in large data sets. That’s clearly descriptive in nature.

Another reason that the line is artificial is a key and valid point Colin White stated as “much of predictive analytics is looking at historical data.” Numerous vendors are beginning to provide predictive analytics by starting with a base of understanding past performance and providing analytics to do what-if planning by seeing what happens as incoming data parameters are modified to simulate different conditions. If Mr. White had spent more time discussing how this evolutionary change is happening rather than inventing stalking horses, it would have been a much better presentation.
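
That evolutionary step can be shown in a few lines. The following is a minimal sketch (Python with scikit-learn; the numbers and features are invented): the model is fit entirely on historical data, and “prediction” is simply re-running it with modified inputs to simulate scenarios.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Historical data: [ad_spend, avg_price] -> units sold (the descriptive base).
X_hist = np.array([[10, 5.0], [12, 5.0], [15, 4.5], [18, 4.0]])
y_hist = np.array([100, 115, 150, 190])
model = LinearRegression().fit(X_hist, y_hist)

# What-if planning: vary ad spend at a fixed price and see predicted demand.
scenarios = np.array([[20.0, 4.0], [25.0, 4.0], [30.0, 4.0]])
print(model.predict(scenarios))
```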

Brent Winsor’s presentation had other problems. On its own, it became rapidly clear that Mr. Winsor was reading a script rather than talking with us. That could be what happens when a marketing organization in a large company decides that too much must be controlled; rather than helping a presenter, the “help” turns into an impediment. From what I could glean in the Q&A, Brent is sharp, knows his stuff, and can relate to the audience; but he was stifled throughout the formal presentation.

The bigger issue, though, was that Brent Winsor focused on self-service BI. While that’s of interest, I pointed out at the beginning how the consultant and company presentations should mesh. One being about data science and the other about self-service BI was not a flow that leads to a powerful overall webinar.

The only real nit I have with his presentation on its own is the issue of self-service. That means empowering the business user to create and manipulate reports and charts to wander the data without requiring IT or others to create new output for them. It makes sense and is something that IBM Cognos and its competitors are enabling in the marketplace. However, just because self-service is important doesn’t mean everyone using the tool is interested in or availing themselves of self-service.

One view of that was Brent’s discussion of the CxO suite, folks who often don’t want to do anything, or much, to the information; they just need the high-level overviews provided by their managers and business analysts so they can synthesize their own conclusions. Many only need to glance at a single drill down or other reporting capability in the existing, traditional tools. However, in an attempt to expand the realm of self-service to every user, Mr. Winsor and IBM have created the category of “managed/scripted” self-service. If your access is being managed or if you are following data flow scripts created by others, you aren’t self-service. We don’t have to put everything in the latest marketing basket. Brent’s points about the spectrum of BI users were dead on; I only think it was a stretch to make all users self-service users.

Not every webinar can be perfect, nor do I claim perfection for my own; but the webinar was so far from it that I came away very disappointed.

TDWI Webinar – BI in the Cloud

Today, TDWI held a webinar on BI in the Cloud. The simple summation: It’s slowly gaining a foothold, but it’s early.

The presentation was a tag team between Fern Halper, Research Director, Advanced Analytics, and Suzanne Hoffman, Sr. Director, Analyst Relations at Tableau Software, the webinar’s sponsor. Fern Halper’s focus, along with plugging her books, was that she sees people beginning to turn the corner in understanding and using the Cloud for BI. A number of indicators of that were slides of survey results from the recent TDWI conference. One slide pointed to results showing 25% of attendees rejecting the public Cloud, at least in the short term, while 36% don’t yet know. That means only 39% are either already using it or planning on using it. It’s growing, but not yet the norm.

Another key aspect of her talk was on a subject many people don’t consider until it’s too late. While people looking at the Cloud focus on the risks of getting on the cloud, such as security and compliance issues, there are issues related to a key concern that exists regardless of where your data resides: Vendor choice.

Your costs have gone up more than expected. Another vendor comes along with features you really need. What do you do? It’s hard enough to migrate between applications that you manage on premises. When your data is hosted by others, what is the access scenario? How will you get your data from one system to another? Decision makers should be planning exit strategies as part of the purchase decision.

Suzanne Hoffman’s segment covered, at a much higher and briefer level, the content of the BBBT presentation I’ve previously described. Due to that briefness and a question from one attendee, I learned something I missed the other day. Tableau Online is the server only. Anyone accessing it must have at least one desktop version to work with it to set up database relations. In this era, where more and more companies are recreating their interfaces using HTML5 in order to provide full functionality anywhere, it’s interesting that a company often described as a disruptive technology is lagging in this respect. This isn’t a problem in the short term, as delivery on multiple devices is still provided and it’s not much of a problem setting up the links on a desktop, but it’s something to watch.

Companies are tiptoeing to the Cloud, and Fern Halper’s presentation shows us the momentum building. We haven’t reached an inflection point, and I’d be surprised if we see one in the next 12-18 months, but it’s good to see TDWI keeping an eye on things and giving us signposts along the way.

Tableau Software at the BBBT

Years ago, I had a great boss. Very smart, good people manager, understood strategy and tactics. There was one flaw. One joke was that he’d never use a sentence if a paragraph would suffice. That showed most clearly in the voluminous number of slides that would be part of any presentation.

He comes to mind because I attended Tableau Software’s presentation to the Boulder BI Brain Trust (BBBT). 93 slides. Yes, 93. Even if folks were not familiar with BI, I think the presentation would have been a bit, shall we say, extensive. Just like that boss, the presenters had a lot of very interesting things to say, but they were often hidden in too much explanation. So that I don’t make the same mistake while describing the presentation, here are my key takeaways:

  • Balanced on-premises and Cloud strategy.
  • Strong forethought about changing relationship between IT & business users.

Everyone even vaguely involved in BI should already be familiar with the basics of Tableau. It is one of the leaders in the latest generation of BI solution providers. They are known for their visualization skills yet haven’t scrimped on the technology needed to manage large data sets. They had a tag team of four presenters. The first was Suzanne Hoffman, Sr. Director of Analyst Relations, who gave the overview of company growth since last year’s BBBT presentation.

Francois Ajenstat, Sr. Director, Product Management, then followed with, as would be expected from product management, a review of products including a discussion of v8.1. In a review of a number of things, the one item that caught my attention was the extension of their Cloud packages. It was clearly stated that the product wasn’t redesigned for the Cloud, just moved outside the firewall to a hosted environment, but that’s a fine first step considering the rest of the work they’ve done. What’s critical is they’ve thought through the Cloud business model and how that is different than what’s needed for purchased software.

The one issue I have, from an analyst perspective, was the very brief discussion of the supposed success and early adoption of their 8.1 offering. While describing the number of companies who have rapidly adopted it, there was no breakdown. How many were new customers versus old? Were they larger or smaller companies? While it is to be expected that newer companies and departments, as well as smaller companies, would be the first to adopt, and while that’s not necessarily an issue, clarity would have helped.

A look ahead at v8.2, which we were told is not under NDA, included what seems to be an extremely professional Mac port. It’s not just moving what was on MS Windows to the Mac, as many companies do. It looks like a product designed for the Mac that will make any user of that platform very happy. While Francois did a good job explaining why they wanted to port to a platform that has 10% of the business market, I think Tableau should be more explicit about who that 10% is and why that matters: marketing. Yes, I know, I’m biased towards marketing, but think about the fact that marketing was the home of the Mac during the lean years. Why? Because the Mac has always been better for visualization, so marketing continued to use the platform for creating ads, graphics, charts and more. Providing a great Mac port enhances the ability of marketing to serve good analytics and presentations to the rest of the firm.

The presentation was then handed to Robert Kosara, a research scientist, to talk about visual storytelling. Here, again, there were too many examples and a blurred message. While trying to describe how the concept differs from the visualization theories that have developed from Tufte onwards, the presenters didn’t make the difference very clear. There seemed to be two main takeaways. First, they’re only describing the fact that modern BI allows isolation of and drill down into segments of the visualization, things that aren’t possible in the old-line static print medium. That seems intuitively obvious rather than a giant leap forward. The second part isn’t about theory but about some of Tableau’s plans (again, we were told this level isn’t under NDA) to provide technology that merges visualization with presentation layout, where the application will be able to take the equivalent of slides of the visualization to increase the presentability of analysis. That’s still early work but could be very interesting as it matures.

The final presenter was Ellie Fields, VP of Product Marketing. Maybe I’m biased, as my background is in product marketing, but I wish this section had not been last. The most important thing in her presentation was what I think best brings out why Tableau is a leader. Her section focused on the changing environment for BI implementations. Everyone in the industry acknowledges that the power of the BI front-end is giving business users more and more ability to be involved in creating and managing their own analytics. We know that IT is going to have to change from a more controlling focus to one of service. Ellie Fields has thought this through and clearly defined how the change doesn’t obviate the need for IT but only changes the relationship between the two sides. The most important slide in the deck is the one below, where a high-level description of that partnership is presented.

[Slide: Tableau IT and business partnership]

The past growth of Tableau Software has clear reasons. It was a good technology introduced to the market at the right time. What they’ve done so far is a sign of management’s ability to understand an early market. Continued growth means crossing Moore’s Chasm, and a longer term strategy is critical. In my (probably not so humble) opinion, while technology is critical, there are far too many technology companies that failed with great products. It’s the vision shown in Francois Ajenstat’s discussion of managing both on-premises and Cloud markets, combined with Ellie Fields’ understanding of how to position the company to help businesses through a major change in how IT and business users relate, that shows the strength of the company moving forward.

I just wish that information had been bubbled up and made more visible rather than lost in a bunch of other words that weren’t really needed.

Kalido: A solution in search of the message?

I had the fortune to see Kalido presentations twice in two days. First was the Qlik road show event on Thursday; second was the Boulder BI Brain Trust call on Friday.

Kalido provides a streamlined way to create and manage data warehouses. The key seems to be a strong data modeler linked to the engine, providing a more graphical way to define processes and link that to the automatic creation of the physical data layers and data warehousing management. According to their case studies, the result is a significant savings in time to deploy and manage warehouses. As they pointed out in the BBBT presentation, the deployment savings is a clear and compelling argument but the longer term saving in ongoing operational costs is one they haven’t yet successfully attacked.

That ties in to the issue of their major message to the Qlik audience and on their web site: “No ETL!” As anyone who understands their technology knows, and as they pointed out in their BBBT presentation, ETL is one component of their solution. The presenter on Thursday tried to claim it’s not ETL, it’s ELT, because they use a temporary data store to more quickly extract information from operational systems, but that’s not going to cut it. ETL is still performed even if in a slightly different order. IT people will understand that and laugh at the claim while most BI business users won’t know what that means and the rest won’t care as it’s not a major concern to those people trying to get information out of the warehouse.
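
The ordering point is easy to see in a sketch. This is illustrative Python of my own, not Kalido’s product: both pipelines do the same extract, transform and load work; ELT merely lands raw rows in a staging store first and transforms there, off the source system.

```python
def extract(source):
    return list(source)                 # pull rows from the operational system

def transform(rows):
    return [r.upper() for r in rows]    # stand-in cleanup/conformance step

source = ["widget", "gadget"]

# ETL: transform in flight, then load the finished result.
warehouse_etl = transform(extract(source))

# ELT: land raw data quickly in staging (minimal load on the source),
# then run the very same transform inside the store.
staging = extract(source)
warehouse_elt = transform(staging)

assert warehouse_etl == warehouse_elt   # same work, different order
```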

Operational costs matter to both IT and the business line managers. As many IT centers internally “bill” divisions for costs, that will still have an impact and matter to both sides more than a specious ETL message.

More importantly, the ability to change your business model and have it rapidly reflected in the data warehouse is of strong value to decision makers. Eliminating 6-9 months of rework, after which the changes are already out of date, is a clear and compelling message for business decision makers. The ability to rapidly satisfy business users in changing markets while using fewer IT resources is valuable to the IT organization.

So why does the message seem to be missing a great market focus opportunity? One possible answer is found on their executive team page. Rather, it’s in what’s not found. A company that wants to leave the startup phase and address a wider market would do well to emphasize marketing with the same importance as engineering and professional services. Products aren’t enough; you have to create messages that address what interests stakeholders. Kalido seems to have a very good product, but they aren’t yet able to create messages that address the wider market.