The Webinar That Couldn’t: Modernizing the Traditional BI Environment

Years ago, I was a programmer. Then folks started calling themselves software engineers. Of course, programming isn’t engineering. The best programmers combine science, engineering and art, as do all good craftsmen, but words such as engineer and scientist hold an allure (and the potential for more money) for some people who aren’t as confident as they should be; hence such modern titles as Sanitation Engineer.

Why does that line of thought come to mind? I spent an hour yesterday listening to a TDWI webinar where one focus was on the Data Scientist when it should have been on business intelligence. The presenters were independent analyst Colin White and IBM Product Marketing Manager Brent Winsor. The typical analyst-and-vendor webinar covers a key theory affecting a market and then shows how a specific product addresses that issue or need. Not so on Thursday.

My primary issue is the artificial split in the claim that BI concerns itself with descriptive and diagnostic analysis while data science looks towards predictive and prescriptive analysis. The main point of many of the new analytics for data mining, created by people calling themselves data scientists, is to understand patterns in large data sets. That’s clearly descriptive in nature.

Another reason the line is artificial is a key and valid point Colin White made: “much of predictive analytics is looking at historical data.” Numerous vendors are beginning to provide predictive analytics by starting from a base of understanding past performance, then offering what-if planning that lets users modify incoming data parameters to simulate different conditions. If Mr. White had spent more time discussing how this evolutionary change is happening rather than inventing stalking horses, it would have been a much better presentation.
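To make that evolutionary approach concrete, here is a minimal sketch, with invented numbers and a deliberately simple linear model: fit to historical data, then vary an input parameter to simulate different conditions.

    # A toy what-if model; all numbers are invented for illustration.
    history = [(10, 52), (20, 98), (30, 151), (40, 205)]  # (spend, revenue)

    n = len(history)
    mean_x = sum(x for x, _ in history) / n
    mean_y = sum(y for _, y in history) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in history)
             / sum((x - mean_x) ** 2 for x, _ in history))
    intercept = mean_y - slope * mean_x

    # What-if planning: vary the incoming parameter to simulate conditions.
    for spend in (50, 60):
        print(spend, round(intercept + slope * spend, 1))

Real offerings are far more sophisticated, but the shape is the same: the prediction is entirely grounded in the historical data, which is exactly why the descriptive/predictive split is artificial.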

Brent Winsor’s presentation had other problems. On its own, it rapidly became clear that Mr. Winsor was reading a script rather than talking with us. That can happen when the marketing organization in a large company decides too much must be controlled; rather than helping the presenter, the “help” turns into an impediment. From what I could glean in the Q&A, Brent is sharp, knows his stuff, and can relate to the audience; but he was stifled throughout the formal presentation.

The bigger issue, though, was that Brent Winsor focused on self-service BI. While that’s of interest, I pointed out at the beginning how the consultant and company presentations should mesh. One being about data science and the other about self-service BI was not a flow that leads to a powerful overall webinar.

The only real nit I have with his presentation on its own is the issue of self-service. That means empowering the business user to create and manipulate reports and charts to wander the data without requiring IT or others to create new output for them. It makes sense and is something that IBM Cognos and its competitors are enabling in the marketplace. However, just because self-service is important doesn’t mean everyone using the tool is interested in or availing themselves of self-service.

One view of that was Brent’s discussion of the CxO suite, folks who often don’t want to do anything, or much, to the information; they just need the high-level overviews provided by their managers and business analysts so they can synthesize their own conclusions. Many only need to glance at a single drill down or other reporting capability in the existing, traditional tools. However, in an attempt to expand the realm of self-service to every user, Mr. Winsor and IBM have created the category of “managed/scripted” self-service. If your access is being managed, or if you are following data flow scripts created by others, you aren’t self-service. We don’t have to put everything in the latest marketing basket. Brent’s points about the spectrum of BI users were dead on; I just think it was a stretch to make all users self-service users.

Not every webinar can be perfect, nor do I claim perfection for my own; but the webinar was so far from it that I came away very disappointed.

TDWI Webinar – BI in the Cloud

Today, TDWI held a webinar on BI in the Cloud. The simple summation: It’s slowly gaining a foothold, but it’s early.

The presentation was a tag team between Fern Halper, Research Director, Advanced Analytics, and Suzanne Hoffman, Sr. Director, Analyst Relations at Tableau Software, the webinar’s sponsor. Fern Halper’s focus, along with plugging her books, was that she sees people beginning to turn the corner in understanding and using the Cloud for BI. She backed that up with slides of survey results from the recent TDWI conference. One slide showed 25% of attendees rejecting the public Cloud, at least in the short term, and another 36% still undecided. That leaves only 39% either already using it or planning to use it. It’s growing, but not yet the norm.

Another key aspect of her talk was on a subject many people don’t consider until it’s too late. While people looking at the Cloud focus on the risks of getting onto it, such as security and compliance issues, there’s a key concern that exists regardless of where your data resides: vendor choice.

Your costs have gone up more than expected. Another vendor comes along with features you really need. What do you do? It’s hard enough to migrate between applications that you manage on premises. When your data is hosted by others, what is the access scenario? How will you get your data from one system to another? Decision makers should be planning exit strategies as part of the purchase decision.

Suzanne Hoffman’s segment covered, at a much higher and briefer level, the content of the BBBT presentation I’ve previously described. Due to that brevity and a question from one attendee, I learned something I missed the other day. Tableau Online is only the server. Anyone accessing it must have at least one copy of the desktop product to set up database relations. In this era, where more and more companies are recreating their interfaces in HTML5 in order to provide full functionality anywhere, it’s interesting that a company often described as a disruptive technology is lagging in this respect. This isn’t a problem in the short term, as delivery on multiple devices is still provided and setting up the links on a desktop isn’t much trouble, but it’s something to watch.

Companies are tiptoeing to the Cloud, and Fern Halper’s presentation shows us the momentum building. We haven’t reached an inflection point, and I’d be surprised if we see one in the next 12-18 months, but it’s good to see TDWI keeping an eye on things and giving us signposts along the way.

Tableau Software at the BBBT

Years ago, I had a great boss. Very smart, a good people manager, understood strategy and tactics. There was one flaw. One joke was that he’d never use a sentence if a paragraph would suffice. That showed most clearly in the voluminous number of slides that would be part of any presentation.

He comes to mind because I attended Tableau Software’s presentation to the Boulder BI Brain Trust (BBBT). 93 slides. Yes, 93. Even if folks were not familiar with BI, I think the presentation would have been a bit, shall we say, extensive. Just like that boss, the presenters had a lot of very interesting things to say, but they were often hidden in too much explanation. So that I don’t make the same mistake while describing the presentation, here are my key takeaways:

  • Balanced on-premises and Cloud strategy.
  • Strong forethought about the changing relationship between IT & business users.

Everyone even vaguely involved in BI should already be familiar with the basics of Tableau. It is one of the leaders in the latest generation of BI solution providers. They are known for their visualization skills yet haven’t scrimped on the technology needed to manage large data sets. A tag team of four presenters handled the session. The first was Suzanne Hoffman, Sr. Director of Analyst Relations, who gave an overview of company growth since last year’s BBBT presentation.

Francois Ajenstat, Sr. Director, Product Management, then followed with, as would be expected from product management, a review of products including a discussion of v8.1. Of the many things reviewed, the one item that caught my attention was the extension of their Cloud packages. It was clearly stated that the product wasn’t redesigned for the Cloud, just moved outside the firewall to a hosted environment, but that’s a fine first step considering the rest of the work they’ve done. What’s critical is that they’ve thought through the Cloud business model and how it differs from what’s needed for purchased software.

The one issue I have, from an analyst perspective, was the very brief discussion of the supposed success and early adoption of their 8.1 offering. They described the number of companies that have rapidly adopted it, but there was no breakdown. How many were new customers versus old? Were they larger or smaller companies? While it is to be expected that newer companies and departments, as well as smaller companies, would be the first to adopt, and while that’s not necessarily an issue, clarity would have helped.

A look ahead at v8.2, which we were told is not under NDA, included what seems to be an extremely professional Mac port. It’s not just moving what was on MS Windows to the Mac, as many companies do. It looks like a product designed for the Mac, one that will make users of that platform very happy. While Francois did a good job explaining why they wanted to port to a platform that has 10% of the business market, I think Tableau should be more explicit about who that 10% is and why it matters. Marketing. Yes, I know, I’m biased towards marketing, but think about the fact that marketing was the home of the Mac during the lean years. Why? Because the Mac has always been better for visualization, so marketing continued to use the platform for creating ads, graphics, charts and more. Providing a great Mac port enhances the ability of marketing to serve good analytics and presentations to the rest of the firm.

The presentation was then handed to Robert Kosara, a research scientist, to talk about visual storytelling. Here, again, there were too many examples and a blurred message. He tried to describe how the concept differs from the visualization theories that have run from Tufte onwards, but the difference wasn’t made very clear. There seemed to be two main takeaways. First, they’re only describing the fact that modern BI allows isolation of, and drill down into, segments of the visualization, things that aren’t possible in the old-line static print medium. That seems intuitively obvious rather than a giant leap forward. The second takeaway isn’t about theory but about some of Tableau’s plans (again, we were told this level isn’t under NDA) to provide technology that merges visualization with presentation layout, where the application will be able to take the equivalent of slides of the visualization to increase the presentability of analysis. That work is still early but could be very interesting as it matures.

The final presenter was Ellie Fields, VP of Product Marketing. Maybe I’m biased, as my background is in product marketing, but I wish this section had not been last. The most important thing in her presentation was what I think best brings out why Tableau is a leader. Her section focused on the changing environment for BI implementations. Everyone in the industry acknowledges that the power of the BI front-end is giving business users more and more ability to be involved in creating and managing their own analytics. We know that IT is going to have to change from a controlling focus to one of service. Ellie Fields has thought this through and clearly defined how the change doesn’t obviate the need for IT but only changes the relationship between the two sides. The most important slide in the deck is the one below, where a high-level description of that partnership is presented.

[Slide: Tableau IT and business partnership]

The past growth of Tableau Software has clear reasons. It was a good technology introduced to the market at the right time. What they’ve done so far is a sign of management’s ability to understand an early market. Continued growth means crossing Moore’s Chasm, and a longer-term strategy is critical. In my (probably not so humble) opinion, while technology is critical, far too many technology companies have failed with great products. It’s the vision shown in Francois Ajenstat’s discussion of managing both on-premises and Cloud markets, combined with Ellie Fields’s understanding of how to position the company to help businesses through a major change in how IT and business users relate, that shows the strength of the company moving forward.

I just wish that information had been bubbled up and made more visible rather than lost in a bunch of other words that weren’t really needed.

Kalido: A solution in search of the message?

I had the fortune to see Kalido presentations twice in two days. The first was at the Qlik road show event on Thursday and the second was the Boulder BI Brain Trust call on Friday.

Kalido provides a streamlined way to create and manage data warehouses. The key seems to be a strong data modeler linked to the engine, providing a more graphical way to define processes and linking that to the automatic creation of the physical data layers and data warehouse management. According to their case studies, the result is a significant savings in time to deploy and manage warehouses. As they pointed out in the BBBT presentation, the deployment savings is a clear and compelling argument, but the longer term savings in ongoing operational costs is one they haven’t yet successfully attacked.

That ties in to the issue of their major message to the Qlik audience and on their web site: “No ETL!” As anyone who understands their technology knows, and as they pointed out in their BBBT presentation, ETL is one component of their solution. The presenter on Thursday tried to claim it’s not ETL, it’s ELT, because they use a temporary data store to more quickly extract information from operational systems, but that’s not going to cut it. ETL is still performed, even if in a slightly different order. IT people will understand that and laugh at the claim, while most BI business users won’t know what it means, and the rest won’t care since it’s not a major concern for people trying to get information out of the warehouse.
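For anyone who hasn’t seen the distinction, here’s a toy sketch, with every name and record invented and no relation to Kalido’s actual product, showing that ETL and ELT run the same three steps, just in a different order:

    # All data and names here are invented for illustration.
    source = [{"amt": "12.5"}, {"amt": "7.25"}]          # raw operational rows

    def transform(rows):
        return [{"amt": float(r["amt"])} for r in rows]  # type cleanup

    # ETL: transform in flight, then load the finished rows.
    warehouse_etl = transform(source)

    # ELT: land the raw rows in a temporary store first, then transform
    # inside the target platform.
    staging = list(source)
    warehouse_elt = transform(staging)

    assert warehouse_etl == warehouse_elt                # same result either way

Whichever order you pick, the extract, transform and load steps all still happen; only where and when the transform runs changes. That’s why “No ETL!” rings hollow.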

Operational costs matter to both IT and the business line managers. Since many IT centers internally “bill” divisions for costs, operational savings will have an impact and matter to both sides far more than a specious ETL message.

More importantly, the ability to change your business model and have it rapidly reflected in the data warehouse is of strong value to decision makers. Eliminating the 6-9 months of rework that can leave a change out of date before it’s even finished is a clear and compelling message for business decision makers. The ability to rapidly satisfy business users in changing markets while using fewer IT resources is valuable to the IT organization.
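As a toy sketch of what model-driven generation means, entirely invented and not Kalido’s actual design, consider defining the business model once and generating the physical layer from it, so a model change regenerates the warehouse definition instead of triggering months of rework:

    # A hypothetical logical model; table and column names are invented.
    model = {
        "customer": {"id": "INTEGER", "name": "TEXT"},
        "orders": {"id": "INTEGER", "customer_id": "INTEGER", "total": "REAL"},
    }

    def ddl(model):
        # Generate the physical layer from the logical model.
        for table, cols in model.items():
            body = ", ".join(f"{col} {sqltype}" for col, sqltype in cols.items())
            yield f"CREATE TABLE {table} ({body});"

    for stmt in ddl(model):
        print(stmt)

    # Change the model (say, add a column) and simply regenerate.
    model["customer"]["region"] = "TEXT"
    print(next(iter(ddl(model))))

The real product handles versioning, data migration and far more, but the principle is the one that should be front and center in the messaging.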

So why does the message seem to be missing a great market focus opportunity? One possible answer is found on their executive team page. Rather, it’s what’s not found there. A company that wants to leave the startup phase and address a wider market would do well to give marketing the same importance as engineering and professional services. Products aren’t enough; you have to create messages that address what interests stakeholders. Kalido seems to have a very good product, but they aren’t yet able to create messages that address the wider market.

Co-opetition: A food industry example

I’m reading an AMA periodical, and an article on Chipotle has a brilliant bit about co-opetition. In case you haven’t heard the term, it was popularized by the 1997 book of the same name. The concept is that in complex markets, companies act as both competitors and partners depending on what’s happening.

The relevant section describes how Chipotle wanted to buy only thigh meat from a vendor, but that wasn’t economical for either company. “But it just so happened that Panera Bread used only chicken breast meat in its recipes, so Chipotle approached its competitor with the idea of working together to more economically source all-natural chicken meat for both restaurant chains. Previously, Bell & Evans mostly supplied high-end restaurants willing to pay a significant price premium. With the Chipotle/Panera cooperation, new economies emerged that didn’t exist before, allowing fast-casual chains to afford to purchase the naturally raised meat and still be profitable.”

Chipotle’s management wasn’t afraid to contact a competitor with a suggestion that would help both companies. That’s good management.

Salesforce, BI, analytics and Birst

This is the third and final blog in my series about last week’s Salesforce1 event in Philadelphia.

I’ve discussed the growth of the Salesforce ecosystem and how the company is bringing back the integration between SFA, CRM and help desks to ensure customer-facing systems work together. They also have partners offering applications such as accounting, quoting and other add-ons to complete the process. The final big issue is how that needs to change reporting.

It’s easy to create charts and dashboards for single applications focused on a department or function. That’s not why business intelligence exists. Upper management needs to better understand how their organizations are performing, and that means analyzing information that comes from multiple systems. BI’s goal is to provide actionable information about the business. That’s why it grew alongside data warehouses that gathered information from disparate systems. Now that Salesforce is clearly creating an integrated environment, are their analytics growing apace? Sadly, no.
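As a toy illustration of why that cross-system view matters, here’s a sketch with made-up records and invented field names, joining SFA and help-desk data by customer to surface something neither system shows on its own:

    # All records and field names are invented for illustration.
    sales = {"acme": {"open_deals": 2, "pipeline": 250_000}}
    support = {"acme": {"open_tickets": 14, "oldest_days": 45}}

    for customer in sales.keys() & support.keys():
        combined = {**sales[customer], **support[customer]}
        # A big pipeline next to a pile of stale tickets is exactly the
        # signal that never appears in either departmental report alone.
        print(customer, combined)

The warehouse, or any integrated analytics layer, exists to make that join routine instead of a one-off exercise.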

“Analytics Unleashed” was the session that interested me above all others. However, to paraphrase Gertrude Stein, there wasn’t much there there. It looked like simple reports slapped into basic dashboards with no real drill down or data discovery capability. Admittedly, the major part of their demonstration didn’t work, and the missing capability might have been in that, but I didn’t see anything in the exhibit hall to make me think it was.

Well, not from Salesforce directly, but don’t be without hope. I’ve talked about the ecosystem they’ve clearly built with Salesforce1, and that comes with advantages. One of those advantages is Birst, a business intelligence firm.

Birst is a Salesforce partner that, unlike a number of other partners at the road show, isn’t exclusively focused on Salesforce. They are part of the new generation of BI firms working to modernize the market. They work to eliminate the need for separate ETL and BI vendors, providing a platform that integrates the data and then directly supports leading-edge analytics capabilities.

That ability combines with a SaaS architecture to allow Birst to work well with Salesforce1. I talked with them at their stand and the integration seemed clean from a UI standpoint. I’d have to dig deeper to understand how well Birst helps Salesforce integrate with other systems, but things look hopeful.

Salesforce seems to be viewing BI the same way as the other large companies. Even with their more modern history, advanced SaaS architecture and appropriately Web-nimble interfaces, Salesforce still thinks of analytics as reporting. If “we” get all applications under our control, we just extend our reporting. That’s not BI. Analytics is a more advanced way of providing better information for decision making.

At some point, I’m sure Salesforce will come to that understanding, and then who knows what will happen. Until then, Salesforce customers can still access the next generation of BI and analytics through the ecosystem that allows Salesforce and Birst to work in concert.

Salesforce1 and “The Internet of Customers”

This is the second part of a series on the Salesforce1 road show held in Philadelphia on March 6, 2014.

One key point repeated throughout the road show event was that behind every device in The Internet of Things is a customer. Good point. The question is whether Salesforce is beginning to truly address customers and not just prospects.

In the early 1990s, I worked at Aurum Software, one of the companies that created the market. In those days, all the companies (Scopus, Vantive and a few others) were trying to address the full customer interface. We had a single database and built early SFA, CRM and customer support applications on top of it. Unfortunately, neither the hardware nor the software allowed for both quick development and the number of users that needed support across all the branches. In addition, since the areas were all pretty new, most prospects wanted to start with one of the three applications. That caused the three branches to split up, and they’ve been separate for most of the last twenty years.

Salesforce is one of the companies working to bring it all together again. As one of the oldest and most successful of the SaaS companies, they’ve been focused on the power of the cloud and how to expand. In the road show, they did a good job of showing SFA, CRM and customer support applications working in concert to benefit companies wanting to understand the true picture of their interactions with their customers.

The company has acquired ExactTarget and Pardot to provide the confusingly named pair of automated marketing and marketing automation. ExactTarget provides list management, scheduling and automated distribution of emails (automated marketing). Pardot helps marketing manage campaigns and the leads moving through those campaigns (marketing automation). I was very impressed by both the Pardot software and team. The software works well on its own and has tight integration with Salesforce SFA, though there’s still work to do in order to show good closed-loop reporting.

On the support side, Salesforce advertises their Service Cloud, supposedly powered by Desk.com. I’m not as impressed by this as I am by the CRM, but the strategy is in place. In addition, having been designed for the internet, Salesforce has a good partner ecosystem, with a number of products and SIs at the road show to help add service to the other customer-facing systems.

The technology is finally in place to build integrated systems that can give management a view of all their customer interactions, and Salesforce has the strategy in place to achieve that. However, as they’ve been focused on acquiring the pieces and building the integration, the one piece still missing is true business intelligence, and that will be the focus of my next post.

Salesforce1 Road Show: Overall impression

Yesterday I attended the Salesforce.com event in Philadelphia, a road show for Salesforce1. Over the next week, I’ll cover some specific topics of interest from the event, but this post is only an overview of my general impressions.

Salesforce1 is presented officially as their new platform. As they say, “One Customer Platform to Connect Everything.”  It’s their product, so the key messages have to focus on them, but the underlying message was even stronger.

They explicitly bragged, again and again, about how great the platform was for mobile, sometimes leaving the impression that mobile was the only reason for the new platform, but an implicit message ran through the entire event.

Salesforce began as a simple sales system aimed at SMBs. It allowed only minor customizations and was very attractive to companies that just needed something that worked rather than something that reflected a large company’s unique selling practices. That was intentional and brilliant for two reasons. First, the SMB market was vastly underserved, with the existing SFA companies focused on the enterprise. Second, it was the best way to build a SaaS business.

As the company grew, it became clear that the decision wasn’t merely tactical but very strategic. Salesforce worked very hard to expand past that start. SaaS matured, hardware and software became faster, and more was possible. The company began expanding its offerings to provide the customization and features needed for the enterprise.

At the same time they worked with many companies in partnerships, as is the norm in our industry. While they have acquired many companies, such as ExactTarget and Pardot, they continue to work with other partners.

Every enterprise software company wants to brag about partnerships, but often their user conferences and roadshows are all about them in the corporate presentations, with only an exhibit hall or a couple of presentations focused on the ecosystem.

The road show, while based on the Salesforce1 platform, was all the more powerful because it avoided that trap. Every presentation I saw not only talked about Salesforce1 but gave examples of both acquired companies and partner solutions working within the Salesforce1 environment. Salesforce is pushing an ecosystem, not just a platform. That’s a powerful business message and it was delivered very well.

SFA & CRM for BI: Sales and Marketing redux

A few weeks back, I blogged about the necessity for sales and marketing to work in a symbiosis rather than a power struggle. This post is about what that means for SFA and CRM.

In the early 1990s, I worked at a startup named Aurum Software. It was one of the first companies to provide CRM solutions. On a humorous tangent, I’m happy CRM won as the name. Our founder came from the ERP space, and he had us call the solution Customer Resource Planning no matter how many times we told him CRP just wasn’t as polite an acronym as the competition’s CRM.

One of the keys to the early success of Aurum and its competitors was that the core databases and software were set up to manage all customer-facing applications. That meant we had sales, marketing and help desk front ends using the same database, so management could see a full picture of customer interaction.

Unfortunately, it was at a time when both software and hardware were much slower and CRM, as it still is, was very data intensive. To get people going, most ISVs concentrated on one of the three aspects and the split between CRM and SFA was born.

The result, over the past twenty years, was that the two systems rarely talked well with each other. Sales personnel would track their prospects through the funnel, marketing would track prospect and customer touch points in communications, but the two were rarely linked, and poorly linked when they were.

I worked with multiple companies that couldn’t combine sales and marketing information with any degree of business intelligence. One would just import the SFA’s first contact, and that would always be the sole “lead source” tracked. Another did the opposite: the first campaign that ever received a ping from someone at a company was always considered the lead source for campaign analysis. It didn’t matter at either firm that the contact was a couple of years old and a recent contact (sales or marketing) had restarted the process.
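To make the problem concrete, here’s a small sketch with invented data comparing first-touch attribution, effectively what both firms did, against one simple recency-aware alternative:

    from datetime import date, timedelta

    # Invented touch history for one prospect, oldest first.
    touches = [
        (date(2012, 1, 10), "trade show"),
        (date(2014, 2, 20), "webinar"),   # a recent touch restarted the process
    ]

    def first_touch(touches):
        return touches[0][1]              # always the oldest contact, however stale

    def recency_aware(touches, window=timedelta(days=180)):
        # Credit the first touch of the most recent engagement window instead.
        latest = touches[-1][0]
        recent = [t for t in touches if latest - t[0] <= window]
        return recent[0][1]

    print(first_touch(touches))    # 'trade show' -- the stale answer
    print(recency_aware(touches))  # 'webinar'

The right window and the right rule are business decisions; the point is simply that a single imported “first contact” can’t express any rule at all.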

One company tried to close the loop but didn’t know how. Marketing tracked campaigns in their CRM system but not in their SFA system. A lead would be passed to sales; when the information came back, the campaign data was no longer attached, and the loop couldn’t be closed.

What’s needed is for both sales and marketing to realize that BI requires a tighter link. Fortunately, the power of systems has advanced enough that better integration is starting to happen. In a bit of foreshadowing, I’ll say that I’m excited to be going to Salesforce.com’s Salesforce1 World Tour tomorrow. Salesforce burst onto the scene years ago with a simple SaaS SFA platform, but it was clear they always intended more. Now, with the growth of their own CRM offerings, it’s starting to come together. I’m interested in seeing more, asking more, then blogging more.


BI: IT becoming shop keepers for business users

One thing everyone is discussing is the changing nature of the IT/Business manager relationship with modern BI solutions. Many folks have talked about “enabling” the end users or used other similar terms.

Late last year, QlikTech and Informatica presented a three-part webinar on the state of BI. I don’t remember whether it was Donald Farmer of QlikTech or David Lyle of Informatica who said the phrase that stuck with me, but it was evocative: IT is changing from gate keeper to shop keeper.

The goal of IT in today’s BI world is to stop being perceived as a barrier to business information and to become someone who quickly provides what the customer needs. I’d even suggest the shop is a grocery store. The business user will check the tools IT provides (applets, reports, basic dashboards and similar components), then have the ability to combine them, tweaking as necessary, to create a meal appropriate to the specific appetite.

What I like about the analogy is that it doesn’t denigrate the importance of IT. Business users aren’t doing an end run; they’re being provided core tools by an IT organization that’s doing a lot behind the scenes, just as the grocer stays very busy ensuring the products are on the shelf and the store functions properly. It’s just that it’s a time sink for both IT and the user if the wrong person is preparing the meal; the user knows best what she wants, so why spend so much time trying to translate for IT? Both sides have a lot of work that needs their skills, and finding the right level of interaction is a boon to both.