Category Archives: Business Intelligence

VisualCue at BBBT: A New Paradigm for Operational Intelligence

The latest presentation to the BBBT was by Kerry Gilger, President and Founder of VisualCue™ Technologies. While I find most of the presentations interesting, this one was a real eye-opener.

Let's start with a definition of operational intelligence (OI): tools and procedures to better understand ongoing business operations. It is a subset of BI focused on ongoing operations in manufacturing, call centers, logistics and other physical operations, where the goal is not just to understand the high-level success of processes but to better understand, track and correct individual instantiations of the process.

A spreadsheet with a row of data for each instantiation is a cumbersome way to quickly scan for the status of individual issues. The following image is an example of VisualCue’s solution: A mosaic of tiles that show different KPIs of the call center process, with a tile per operator, color coded for quick understanding of the KPIs.

VisualCue call center mosaic

The KPIs include items such as call times, number of calls and sales. The team understands each element of the tile, and a quick review shows the status of each operator. Management can drill down into a tile to see specifics and take corrective actions.
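To make the tile concept concrete, here is a minimal sketch in Python of how per-operator KPIs might be reduced to the color cues a mosaic displays. The operators, KPI names and thresholds are hypothetical, not VisualCue's actual rules; the point is simply one tile per operator, one color per KPI.

# Minimal sketch: reduce each operator's KPIs to green/yellow/red cues.
# All names and thresholds are hypothetical, not VisualCue's implementation.

OPERATORS = [
    {"name": "A. Jones", "avg_call_min": 4.2, "calls_today": 38, "sales": 5},
    {"name": "B. Smith", "avg_call_min": 9.8, "calls_today": 12, "sales": 0},
]

# (kpi, warning threshold, bad threshold, True if a higher value is worse)
RULES = [
    ("avg_call_min", 6.0, 9.0, True),
    ("calls_today", 25, 15, False),
    ("sales", 3, 1, False),
]

def status(value, warn, bad, higher_is_worse):
    """Return the color cue for a single KPI value."""
    if higher_is_worse:
        return "red" if value >= bad else "yellow" if value >= warn else "green"
    return "red" if value <= bad else "yellow" if value <= warn else "green"

def tile(operator):
    """One mosaic tile: the operator plus a color per KPI."""
    return {
        "operator": operator["name"],
        "cues": {kpi: status(operator[kpi], warn, bad, hw)
                 for kpi, warn, bad, hw in RULES},
    }

for op in OPERATORS:
    print(tile(op))

A manager scanning the resulting grid sees mostly green at a glance and can drill into the red tiles, which is exactly the review-then-correct loop described above.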

The mosaic is a quick way to review all the instantiations of a given process, a new and exciting visualization method in the industry. However, they are a startup and there are issues to watch as they grow.

They have worked closely with each customer to create tiles that meet that customer's needs, and they are working to make it easier to replicate industry knowledge so new customers can start faster and less expensively.

The product has also moved from full on-site code to a SaaS model to provide shared infrastructure, knowledge and more in the Cloud.

VisualCue understands that operational intelligence is part of the BI space and has begun to work with established BI vendors to integrate the mosaic with the other elements that make up a robust dashboard. That work is rightfully in its infancy given the company's stage of evolution; if they keep building and expanding those relationships, there's no problem.

However, the thing that must change to make it a full-blown system is how they access the data. It's understandable that a startup expects a customer to figure out all its own data access issues and provide a single-source database to drive the mosaics, but as they grow they're going to have to work more closely with ETL and other vendors to provide a more open access methodology, and a more dynamic, open data definition and access model than "give us a lump of data and we'll work with it."

Given where the company is right now, those caveats are more foibles than anything else. They have the time to build out the system and their time has, correctly, been spent in creating the robust visualization paradigm they demonstrated.

If Kerry Gilger and the rest of his team are able to execute the vision he's shown, VisualCue will deliver a major advance in the ability of business management to quickly understand operations and get the kind of instant feedback that improves performance.

TDWI Webinar on in-memory data, another miss

I've been skipping the last few TDWI webinars, not exactly knowing how to politely criticize some poor ones. However, I feel as if I'm doing a disservice to those who read, so I'll have to discuss today's.

The title was “In-Memory Computing: Expanding the Platform Horizon Beyond the Database.” The pitch was that in-memory is so good for databases that we should think about doing everything else in the information chain there too, from ETL on. One word critique: Oy.

In-memory analytics has been great for very fast processing. Having the data resident in memory is obviously a great way of providing rapid response for users of reports and analytic tools. However, it’s no panacea.

Simply put, two demands are limiting the cost-effectiveness, and even the feasibility, of in-memory analytics: the amount of data and the number of users.

One of the repeated refrains of in-memory proponents is "memory is cheap!" Yes, it is. However, massively parallel servers with the ability to efficiently link multiple CPUs to large amounts of memory, while providing coherence for multiple users, aren't. They quickly get very expensive; measured purely on memory capacity, high-end machines cost far more than commodity servers. There's also an upper bound, and for much of today's larger-scale data analytics, multiple servers will be needed.

The other issue is that the growth of self-service BI and mobile access to reports means more memory is needed for non-database usage. A number of in-memory solution providers will tell you that each user takes space in memory to satisfy individual needs. The more users, the more space is taken away from database availability.

The growth of server farms, now being built in the Cloud, is how the tradeoff between in-memory performance and space requirements will be addressed. "Fast enough" matters more than millisecond response time. Given what we are constantly learning about both data manipulation and presentation, the strongest Cloud providers will win by keeping the most-used information in memory and sharing the rest among caches on multiple servers.

In-memory isn't new, and needs are much different than when it first appeared. Listen to people talk about it and pay attention: if it's the only thing discussed, they're not being honest; if it's a core part of the solution with its caveats addressed, the vendor, analyst or pundit is helping you.

BI is Dead! Long Live BI!

In memory! Big Data! Data Analytics! The Death of BI!!!!

We in technology love the new idea, the new thing. Often that means we oversell the new. People love to talk about technology revolutions, but those are few and far between. What usually happens is evolution, a new take on an existing idea that adds new value to the same concept. The business intelligence space is no more immune to the revolution addiction than is any other area, but is the revolution real?

Let's begin to answer that by starting with the basic question: what is BI? All the varied definitions come down to the simple point that BI is an attempt to better understand business through analyzing data. That's not a precise definition but a general one. IT was doing business analysis before it was called that, either in individual systems or ad hoc through home-grown software. Business intelligence became a term when MicroStrategy, Cognos, Business Objects and others began to create applications that could combine information across systems and provide a more consistent and consolidated view of that data.

Now we have a new generation of companies bringing more advanced technologies to the fore to improve data access, analysis and presentation. Let's take a look at a few of the new breed and ask whether they're evolution or revolution.

Some Examples

In-Memory Analysis

As one example, let's discuss in-memory data analysis as headlined by QlikTech, Tableau and a few others. By loading all the data to be analyzed into memory, performance is much faster. Eliminating disk access and the caching layers around it means blazingly fast analytics. What does this mean?

The basic response is that more data can be massaged without appreciable performance degradation. We used to think nothing of getting weekly reports; now a five-second delay between clicking a button and seeing a pie chart is considered excessive. However, is this a revolution? Chips are far faster than they were five years ago, not to mention twenty. While technologies have been layered to squeeze more junctions, transistors and so on into smaller spaces, it's the same theory and concept. The same is true of faster display of larger data volumes. It's evolutionary.

Another advantage of the much faster performance is that you can do far more complex calculations; moving from simple pivot tables to scatter charts with thousands of points is a real change in complexity. However, it's still just another way of providing business information to business people so they can make decisions. It's hard to describe that as revolutionary, yet some do.
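As a rough illustration of why the in-memory approach feels so fast, here is a small Python/pandas sketch: after a single load, every subsequent slice or aggregation is a pure in-memory pass with no disk access. The column names and numbers are hypothetical, and this is generic pandas, not any vendor's engine.

# Rough illustration of the in-memory idea. All data is hypothetical.
import pandas as pd

# In production the one-time load would be read_csv or a database pull;
# here the table is built inline so the sketch runs on its own.
sales = pd.DataFrame({
    "region":    ["East", "East", "West", "West", "West", "East"],
    "sales_rep": ["Lee",  "Kim",  "Ortiz", "Ortiz", "Patel", "Lee"],
    "revenue":   [120.0,  340.0,  210.0,   95.0,    400.0,   75.0],
})

# Each "click" in a dashboard becomes another in-memory pass over the table.
by_region = sales.groupby("region")["revenue"].sum()
west_by_rep = (
    sales[sales["region"] == "West"]
    .groupby("sales_rep")["revenue"]
    .agg(["count", "sum", "mean"])
    .sort_values("sum", ascending=False)
)
print(by_region)
print(west_by_rep)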

Big Data

Has there been a generation of business computing that hasn't thought it had big data? There's a data corollary to Parkinson's law which says that data expands to fill the available disk space. The current view of big data is just the latest set of techniques that allow people to massage the latest volumes of data being created in our interconnected world.

Predictive Analytics

Another technical view of business holds that, because software has historically been limited to describing what has happened, business leaders weren't using it for predictive analysis. Management has always been the combination of understanding how the business has performed and then predicting what will be needed for future market scenarios in order to plan appropriately. Yes, there are new technologies that are advancing the ability of software to provide more of the analytics that help with prediction, but it's a sliding of the bar, allowing software to do a bit more of what business managers had been doing through other means.

But there’s something new!

Those new technologies and techniques are all advancing the ability of business people to do their jobs faster and more accurately, but we can see how they are extensions of what came before. Nobody is saying "Well, I think I'll dump my old dashboards and metrics and just use predictive analysis!" The successful new vendors are still providing everything described in "legacy BI systems." It's extension, not replacement – and that's further proof of evolution in action.

VisiCalc (ask your parents…) no longer exists, while Crystal Reports (again…), Cognos, Business Objects and others have been absorbed by larger companies to become must-haves in any business application. In the same way, none of the current generation of software companies is saying that the latest analytics techniques are sufficient on their own. They all support and recreate the existing body of BI tools within their own dashboards and interfaces. The foundation of what BI means keeps expanding.

“Revolutionary”?

It's really very simple: technology companies tend to be driven by technologists. Stunning, I know. Creating in-memory analytics was a great piece of technical work; it hadn't been successfully done until this generation of applications. We knew we were accumulating more and more data from the internet and other sources, and then some brilliant people thought up algorithms and supporting technologies that let people analyze that data in much faster ways. People within the field understood that these technical breakthroughs were very innovative.

On the other side, there's marketing. Everyone's looking for the differentiator, the thing that makes your company stand out as different. Most marketing people don't have the background to understand the technology, so when they hear that their firm is the first, or one of the first, to think of a new way of doing things, they can't really put it into context. New becomes unique becomes revolutionary. However, that's dangerous.

Well, not always.

To understand that, we must refer to one of the giants in the field, Geoffrey Moore. In "Crossing the Chasm" and "Inside the Tornado," Moore formally described something that many had felt: there's a major difference between early adopters and the mass market. There's a chasm that a company must cross in order to change from addressing the needs of the former to the needs of the latter.

The bleeding-edge folks want to be revolutionary, to see themselves as visionaries. Telling them that they need to spend a lot of money and a lot of sleepless nights on buggy new software because it's a step in the right direction doesn't work quite as well as telling them it's a revolutionary approach. As startup founders like to think of themselves in the same terms, it's a natural fit. We've all joined the revolution!

There's nothing wrong with that – if you are prepared for what it entails. There are people who are willing to get the latest software. Just make sure you understand that complexity and buggy young code don't mean a revolutionary solution, just a new product.

Avoiding Revolution

Then there’s the mass market. To simplify Moore’s texts, business customers want to know what their neighbors are doing. IT management wants to know that their always understaffed, underfunded and overworked department isn’t going to be crushed under an unreasonable burden. Both sides want to know their investment brings a proper ROI in an appropriate time frame. They want results with as little pain as possible.

We've seen that the advances are in BI, not past BI. Yet so many analysts keep proclaiming the death of something that isn't dying. How does that help the market or younger companies? It points people in the wrong direction and slows adoption.

Not all companies have fallen for that. Take one of the companies currently making the most headway with new technologies, Tableau Software. They clearly state, on their site, "Tableau is business intelligence software that allows anyone to connect to data in a few clicks, then visualize and create interactive, sharable dashboards with a few more." They have not shied away from the label and they're not trying to replace it. They're extending the footprint of BI to do more things in better ways. Or take QlikTech's self-description: "QlikTech is the company behind QlikView, the leading Business Discovery platform that delivers user-driven business intelligence (BI)."

There are many companies out there who understand that businesses want to move forward as smoothly as possible and that evolution is a good thing. They’ll talk to you about how you can move forward to better understand your own business. They’ll show you how your IT staff can help your business customers while not becoming overwhelmed.

Know if you’re willing to be an early adopter or not, then question the vendors accordingly. As most of you are, purely by definition, in the mass market, move forward with the comfort that BI is not going anywhere and plenty of companies are merging the old and new in ways that will help you.

To paraphrase Mark Twain, the reports of BI’s death have been greatly exaggerated.

1010data at the BBBT: Cool technology without a clear strategy

The presenters at last Friday's BBBT session were from 1010data. The company provides some complex and powerful number-crunching abilities through a spreadsheet paradigm. As with many small technical companies, they have the problem of trying to distinguish between a technology and a business solution.

Let's start with the good side: the engine seems very cool. They use distributed technology in the Cloud to provide the ability to rapidly filter through very large data sets. It's no surprise that their primary markets seem to be CPG and financial companies, as both deal with high volumes of daily data. It's also no surprise because those markets have very technical business users who are used to looking at data via spreadsheets.

The biggest problem is that spreadsheets are fine for looking at raw data, but not for understanding anything except a heavily filtered subset of it. That's why the growth of BI has been in visualization. Everything 1010data showed us involved heavy work with filters, functions, XML and more. The few graphics they showed look twenty years out of date and quite primitive by modern standards. This is a tool for a power user.

Another issue, showing the secondary thought given to re-use and display of information, is their oxymoronically named QuickApps. Because the spreadsheet analysis is done in the cloud on the live data set, reusing the information in reports takes a lot of work. The technical presenter was constantly diving into complex functions and XML code. That's not quick.

When asked about that, the repeated refrain was about how spreadsheets are everywhere. True, but the vast majority of Microsoft Excel™ users use no functions, or only the very simplest such as sum() and a few others. Only power users create major results, and BI companies have grown by moving people from Excel to better ways of displaying results.

I must question whether CEO and Co-founder Sandy Steier understands where the company fits into the BI landscape. He constantly referred to Cognos and MicroStrategy as if they're the current technology leaders in BI. Those solutions are good, but they are not the focus of conversation when talking about the latest in visualization or in-memory technologies. The presentation did have one slide that listed Tableau, but their web site was devoid of references to the modern generation (or they were well hidden). Repeated questions about relationships with visualization vendors were deflected to other topics and never addressed.

A key point of focus was an early statement by Mr. Steier that data discovery is self-service reporting. That's the typical technical person's confusion between technology and business needs. Data discovery is the ability to understand relationships between pieces of data to build information for decision making. Self-service reporting is just one way of telling people what you've discovered. Self-service business intelligence is a larger issue that includes components of both.

I very much liked the technology, but I must question whether the management of 1010data has the vision to figure out what they want to do with it. Two of many possible options show the need for that choice. First, they can decide to be a new database engine, providing a very powerful and fast data repository from which other vendors can access and display information. Second, they can focus on adding real visualization to help them move past the power users so that regular business users can directly leverage the benefits. The two strategic choices imply very different implementation tactics.

To summarize: I was very impressed with 1010data’s technology but am very concerned about their long term potential in the market.

ETL across the firewall: SnapLogic at the BBBT

SnapLogic presented at the BBBT last Friday. I was on the road, so I watched the video today. The presentation was by Darren Cunningham, VP Marketing, and Craig Stewart from product management. It was your basic dog and pony show with one critical difference for the BI space: they understand hybrid systems.

Most of the older BI vendors are still on-premises and tiptoeing into the Cloud. Most of the newer vendors are proudly Cloud. The issue for enterprises is that they are clearly in a strongly hybrid situation, with a very mixed set of applications inside and outside the firewall. Vendors talk about supporting hybrid environments, but dig down and you find it's one way or the other, with minimal thought given to supporting the other half.

Darren made it clear from the beginning that SnapLogic understands the importance of a truly hybrid environment. They are, ignoring all the fancy words, ETL for a hybrid world. They focus on accessing data equally well regardless of on which side of the firewall it resides. Their partner ecosystem includes Tableau, Birst and other BI vendors, while SnapLogic focuses on providing the information from disparate systems.

Their view was supported by a number of surveys they'd performed. While the questions listed had the typical tilt of vendor-sponsored surveys, they still provided value. The key slant, one that has implications for their strategic planning, is shown by one survey question on "Technical Requirements of a Cloud Integration Platform." "Modern scalable architecture" came in first, while "Ease of use for less technical users" was third.

As Claudia Imhoff accurately pointed out, the basic information might be useful, but it's clear from their presentation that this was an IT-focused survey and should be treated as such. It would be interesting to see the survey run for both IT and business users to see the difference in priorities.

SnapLogic looks like they have a good strategy; the thing to watch is how they grow. The key founder is Gaurav Dhillon, one of the founders of Informatica. He had a good strategy there too but was replaced when the company grew to a point where he couldn't figure out the tactics to get over the chasm (full disclosure: I worked at Informatica in 1999, when it hit the wall. I'm not unbiased). Let's hope he learned his lesson. There's a clear opportunity for SnapLogic's software, and it seems to be going well so far, but we'll need to watch how they execute.

TDWI: Evolving Data Warehouse Architectures in the Age of Big Data Analytics

Today's TDWI webinar was a presentation by Philip Russom, Research Director, on "Evolving Data Warehouse Architectures in the Age of Big Data Analytics." It was based on his just-released research paper of the same name. Overall, it was a good overview of where their survey shows the market to be, but I have a nit.

One of the best points Mr. Russom made was during his discussion of responses to "What technical issues or practices are driving change in your DW architecture?" The top response, with 57%, was advanced analytics. Mr. Russom pointed out that's about as good a proof as you get of business driving technology: advanced analytics is a business need, and that need is pushing the technical decision. Too many people get too wrapped up in technology to realize it exists to solve problems.

TDWI survey: what is data warehouse architecture?

Another key point was made about the evolving nature of the definition of data warehousing. Twenty years ago, it was about creating the repository for combining and accessing the data. That is now definition number three; the top two responses show a higher-level business process and strategy in place than "just get it!"

Where I have a problem with the presentation is when Mr. Russom stated that analytics are different from reporting. That's a technical view, not a business one. His talk contained the reality that first we had to get the data and now we can move on to more in-depth analysis, but he still thinks they're very different. It's as if there's a wall between basic "what's the data" and "finding out new things," concepts he said don't overlap. Let's look at the current state of BI. A "report" might start with a standard layout of sales by territory. However, the Sales EVP might wish to wander the data, drilling down and slicing and dicing to understand things better by industry within territory, by cities within those, and by other metrics across territories. That combines what he defines as separate reporting and data discovery.

Certainly, the basic row-based reporting can switch to columnar structures for better analytics, but that’s techie. The business user sees a simple continuum. Like most other areas where technology advances (pretty much all of life…), business software solves a problem and then allows people to see past that one to the next. That successor problem doesn’t have to be completely different, and in the case of reporting and analytics, it’s not.
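Here is a small pandas sketch of that continuum, with entirely hypothetical data: the same in-memory table answers both the standing "sales by territory" report and the EVP's ad-hoc drill-down, which is why treating reporting and discovery as disjoint doesn't match how business users actually work.

# Illustrative sketch only: the same table serves the report and the drill-down.
import pandas as pd

sales = pd.DataFrame({
    "territory": ["East", "East", "West", "West", "West"],
    "industry":  ["Retail", "Finance", "Retail", "Finance", "Finance"],
    "city":      ["Boston", "Boston", "Denver", "Seattle", "Seattle"],
    "revenue":   [120, 340, 210, 95, 400],
})

# The standard "report": sales by territory.
report = sales.groupby("territory")["revenue"].sum()

# The "discovery" step: same data, one level deeper, chosen on the fly.
drill_down = (
    sales[sales["territory"] == "West"]
    .groupby(["industry", "city"])["revenue"]
    .sum()
)
print(report)
print(drill_down)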

The final takeaway I received from his webinar helped support that concept even if his earlier words didn't. He talked about the multi-platform data warehouse environment: DWs aren't going anywhere, they're simply being incorporated into a wider ecosystem of data technologies in order to continue to improve the understanding and decision-making capabilities of business managers.

Other than the disagreement I have with his view of reporting and analytics, I heard a good presentation and suggest people check out the full report.

Qlik at BBBT. QlikView Expressor: Come for the ETL, stay for the Data Governance

Last Friday's BBBT presentation was by Qlik, and the primary purpose was to discuss QlikView Expressor, but that was just a foundation for what really caught my eye: a great advance in data governance.

Qlik bought Expressor Software last June, and the presentation was the chance to show the re-branded products to the analysts in the group. Expressor brings baby ETL to Qlik. The presenters, Donald Farmer and Bill Kehoe, were very honest and clear that the product is intended for those who start with basic self-service BI and find they need to reach multiple sources as they begin to expand their use. I'll be clear: this is a baby product. Their public case study was, according to the slides, using ODBC and Salesforce.com's very open API. How they can, and even whether they should, handle access to more complex and proprietary systems remains a big question.

As Informatica and other major players in the ETL space have strong partnerships with Qlik, it's a careful game Qlik has to play. On one side, they have to provide some basic ETL functionality to a key portion of their market; on the other, they have to avoid alienating the big players. Products acquired into such a middle ground often either fail from the lack of a clear solution or cause problems with partnerships, but only time will tell how Qlik will handle this. For the time being, I don't see this product being a threat to their partners.

The presenters hedged the early message about ETL being a way to govern access to data, and why became very clear as the presentation entered its second section. QlikView Expressor is being used as a component driving Qlik's new QlikView Data Governance Dashboard. The company has done an amazing job of blending ETL, their existing and well-known BI presentation software, and a smart overview of the full architecture to take a very good step forward in helping companies understand where their data is being used.

As Donald Farmer pointed out, only half humorously, “Microsoft Office for the iPad has killed data governance.” KPIs defined in multiple departments, different reports on different computers and the growth of laptops made data governance difficult in the previous decades. The boom in tablet use has expanded that challenge exponentially. Having the leading business productivity suite now available on the leading tablets means company reports, spreadsheets and more are spread even further through the business ecosystem. Data governance becomes vastly more difficult to achieve.

The Data Governance Dashboard is a first step in helping IT and business users understand what information is out there, where it’s residing and how much of it is used how often.

This isn't a blog about features, but one must be mentioned because it is critical. Knowing which data fields are accessed by which BI reports is important by itself, and Qlik and others have been looking at that. The extension that matters is reporting that begins to link the labels users see to the underlying fields. Think about the ability to see that two divisions both use the label "Gross Profit" and then understand that they're using different fields, definitions and underlying data to create the displayed numbers.
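Here is a toy sketch of what that kind of label-to-field linkage makes possible, with hypothetical report names and expressions rather than Qlik's actual metadata: once labels are tied to the fields behind them, conflicting definitions of the same label fall out automatically.

# Toy governance check: flag labels that resolve to different underlying
# definitions across reports. All names and expressions are hypothetical.
from collections import defaultdict

report_lineage = [
    # (report, label shown to users, underlying field or expression)
    ("Sales Division Dashboard",  "Gross Profit", "revenue - cogs"),
    ("Services Division Summary", "Gross Profit", "revenue - cogs - delivery_cost"),
    ("Exec Overview",             "Revenue",      "revenue"),
]

fields_by_label = defaultdict(set)
for report, label, field in report_lineage:
    fields_by_label[label].add(field)

for label, fields in fields_by_label.items():
    if len(fields) > 1:
        print(f"CONFLICT: '{label}' is computed {len(fields)} different ways: {sorted(fields)}")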

Self-service BI is reaching the point where desktop computers were in the early 1990s. The 1980s saw business users run away from what they saw as a controlling IT infrastructure. It helped and it created confusion. IT and business had to find new ways to work together, reintegrate and help both sides help the company. The Governance Dashboard is something that can help lead the BI industry into that same step to help both sides provide improved and consistent information to business decision makers. Well done.

TDWI Webinar – Preparing Data for Analytics with Liaison Technologies

Tuesday's TDWI Webinar was titled "Preparing Data for Analytics," but that's not what it was about. It was still interesting, just misnamed. The focus was, as would be expected with Liaison as the sponsor, on how managing data in the Cloud can enhance the ability of some companies to support BI.

It started with Philip Russom, an in-house TDWI analyst, talking a bit about preparing data without ever using the words extraction and transformation. The most interesting point he made was an aside in the main presentation, but one that should be brought to the fore: what's SMB?

Most of us think of SMB as Small-to-Medium sized Businesses. His point was that it's really Small-to-Medium sized Budgets. Many folks involved in enterprise software understand that you often don't sell an enterprise license up front. A software vendor will sell to a department or a business division with a smaller budget that needs to get something done. The tactic is then to expand within the enterprise and build it into a major account. Mr. Russom makes the great point that the tactics for selling into smaller groups within an enterprise are very similar to those used to sell to smaller businesses, so maybe there are opportunities being left on the table by smaller vendors.

His other key point needs a webinar of its own. He mentioned that companies looking for Cloud analytics should “make sure Cloud solutions are optimized for data and not for applications.” It’s the data that’s important, how you get it and how you prepare it. That’s what has to be supported first, then applications can access the data. Sadly, he said that and moved on without any real details on what that preparation means. I’d like to see more details.

The main speaker was Alice Westerfield, Sr. VP of Enterprise Sales at Liaison Technologies. Her main point followed Mr. Russom's lead-in, pushing the idea that a good analytics platform requires moving from an application-centric approach to a data-centric one. No surprise, Liaison has one handy. Most importantly, it's a Cloud approach, one they've been offering since before Cloud became the buzzword.

Alice was brief but focused on their history of supporting integration in the Cloud. The four main benefits she mentioned were:

  • Data Integration
  • Data Transformation
  • Data Management
  • Data Security

That makes sense, and we all know how much the last one, security, matters before people are willing to perform the first three in the Cloud. However, it's the changing nature of the data game that makes me want to focus on the first, data integration.

While Liaison talks about the benefits of leveraging their years of integration skills rather than reinventing integrations or installing new ones in an on-premises solution, there's another Cloud aspect that I think is critical. Most businesses use a mix of applications, and many are already in the Cloud. Add to that the mobile nature of today's generation of BI solutions, which are provided in the Cloud. It makes sense for many SMBs to leverage that. Why take data from Cloud apps, move it on-premises and then move it back to the Cloud? Using a service such as Liaison's simplifies and speeds the process of meshing data from inside and outside the firewall and then providing wide access to knowledge workers through the new BI interfaces.

For the foreseeable future, there will continue to be reasons for keeping data within the firewall, but for most data and most companies, a solution such as Liaison’s would seem to be an opportunity to quickly integrate data and share it as broadly as required.

The Webinar That Couldn’t: Modernizing the Traditional BI Environment

Years ago, I was a programmer. Then folks started calling themselves software engineers. Of course, programming isn't engineering. The best programmers combine science, engineering and art, as do all good craftsmen, but words such as engineer and scientist hold an allure (and the potential for more money) for some people who aren't as confident as they should be, hence such modern titles as Sanitation Engineer.

Why does that line of thought come to mind? I spent an hour yesterday listening to a TDWI webinar where one focus was on the Data Scientist when it should have been on business intelligence. The presenters were independent analyst Colin White and IBM Product Marketing Manager Brent Winsor. The typical analyst & company presenter webinar covers a key theory impacting a market and then how a specific product addresses that issue or need. Not so on Thursday.

My primary issue is the artificial split in the claim that BI concerns itself with descriptive and diagnostic analysis while data science looks towards predictive and prescriptive analysis. The main point of many of the new analytics for data mining, created by people calling themselves data scientists, is to understand patterns in large data sets. That’s clearly descriptive in nature.

Another reason the line is artificial is a key and valid point Colin White made: "much of predictive analytics is looking at historical data." Numerous vendors are beginning to provide predictive analytics by starting with a base of understanding past performance and then providing analytics for what-if planning, seeing what happens as incoming data parameters are modified to simulate different conditions. If Mr. White had spent more time discussing how this evolutionary change is happening rather than inventing straw men, it would have been a much better presentation.
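A tiny sketch of that evolutionary step, using made-up numbers: describe the past with a simple fit, then "predict" by varying an input parameter to simulate scenarios. It's deliberately simplistic, but it shows how predictive what-if analysis sits directly on top of historical, descriptive data.

# Illustrative only: what-if projection built on historical data.
import numpy as np

# Historical performance: ad spend (k$) vs. quarterly revenue (k$). Hypothetical.
ad_spend = np.array([50, 60, 75, 80, 95, 110])
revenue  = np.array([400, 470, 560, 590, 700, 790])

# Describe the past with a simple linear fit...
slope, intercept = np.polyfit(ad_spend, revenue, 1)

# ...then "predict" by modifying the input parameter to simulate scenarios.
for scenario_spend in (120, 140, 160):
    print(f"spend {scenario_spend}k -> projected revenue "
          f"{slope * scenario_spend + intercept:.0f}k")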

Brent Winsor's presentation had other problems. On its own, it rapidly became clear that Mr. Winsor was reading a script rather than talking with us. That could be what happens when a marketing organization in a large company decides that, rather than helping a presenter, too much must be controlled, and the "help" turns into an impediment. From what I could glean in the Q&A, Brent is sharp, knows his stuff, and can relate to the audience; but he was stifled throughout the formal presentation.

The bigger issue, though, was that Brent Winsor focused on self-service BI. While that's of interest, I pointed out at the beginning how the consultant and company presentations should mesh. One being about data science and the other about self-service BI is not a flow that leads to a powerful overall webinar.

The only real nit I have with his presentation on its own is the issue of self-service. Self-service means empowering the business user to create and manipulate reports and charts and wander the data without requiring IT or others to create new output for them. It makes sense and is something that IBM Cognos and its competitors are enabling in the marketplace. However, just because self-service is important doesn't mean everyone using the tool is interested in or availing themselves of self-service.

One view of that was Brent's discussion of the CxO suite, folks who often don't want to do anything, or much, to the information; they just need the high-level overviews provided by their managers and business analysts so they can synthesize their own conclusions. Many only need to glance at a single drill-down or other reporting capabilities in the existing, traditional tools. However, in an attempt to expand the realm of self-service to every user, Mr. Winsor and IBM have created the category of "managed/scripted" self-service. If your access is being managed, or if you are following data flow scripts created by others, you aren't self-service. We don't have to put everything in the latest marketing basket. Brent's points about the spectrum of BI users were dead on; I only think it was a stretch to make all users self-service users.

Not every webinar can be perfect, nor do I claim perfection for my own; but the webinar was so far from it that I came away very disappointed.

TDWI Webinar – BI in the Cloud

Today, TDWI held a webinar on BI in the Cloud. The simple summation: It’s slowly gaining a foothold, but it’s early.

The presentation was a tag team between Fern Halper, Research Director, Advanced Analytics, and Suzanne Hoffman, Sr. Director, Analyst Relations at Tableau Software, the webinar's sponsor. Fern Halper's focus, along with plugging her books, was that she sees people beginning to turn the corner in understanding and using the Cloud for BI. A number of indicators of that were slides of survey results from the recent TDWI conference. One slide pointed to results showing 25% of attendees rejecting the public Cloud, at least in the short term, and another 36% who don't yet know. That means only 39% are either already using it or planning to use it. It's growing, but it's not yet the norm.

Another key aspect of her talk was a subject many people don't consider until it's too late. While people looking at the Cloud focus on the risks of getting onto it, such as security and compliance issues, there are also issues related to a key concern that exists regardless of where your data resides: vendor choice.

Your costs have gone up more than expected. Another vendor comes along with features you really need. What do you do? It’s hard enough to migrate between applications that you manage on premises. When your data is hosted by others, what is the access scenario? How will you get your data from one system to another? Decision makers should be planning exit strategies as part of the purchase decision.

Suzanne Hoffman's segment covered, at a much higher and briefer level, the content of the BBBT presentation I've previously described. Due to that brevity and a question from one attendee, I learned something I missed the other day: Tableau Online is the server only. Anyone accessing it must have at least one desktop version to set up the database relations. In an era when more and more companies are recreating their interfaces in HTML5 to provide full functionality anywhere, it's interesting that a company often described as a disruptive technology is lagging in this respect. It isn't a problem in the short term, as delivery on multiple devices is still provided and setting up the links on a desktop isn't much of a burden, but it's something to watch.

Companies are tiptoeing to the Cloud, and Fern Halper's presentation shows us the momentum building. We haven't reached an inflection point, and I'd be surprised if we see one in the next 12-18 months, but it's good to see TDWI keeping an eye on things and giving us signposts along the way.