
Review: TDWI Advanced Analytics Best Practices Report and Webinar

This Tuesday, Fern Halper of TDWI gave a talk on next generation analytics in order to promote the latest TDWI report on the topic. I’ll be bouncing between the webinar and the report during this blog entry, but since the report is the source for the webinar, treat it as the basis. While there were some great nuggets in the study, let’s start off with the overblown title.

As Ms. Halper notes in the executive summary of the report, “Next-generation platforms and analytics often mean simply pushing past reports and dashboards to more advanced forms of analytics, such as predictive analytics.” In other words, anyone doing anything new can claim they’re using next generation analytics, so the phrase doesn’t mean much.

This report is also better than a number of others I’ve seen from TDWI for something that appears early on: the demographics. The positions of the respondents seem far more balanced between IT and non-IT than in other reports, which lends this one more credibility when discussing business intelligence that matters to business.

The first statistic of note from the webinar came when we were told that 40% of respondents already use advanced analytics. Let’s deal with the bigger number: 60% of respondents, supposedly the cream of the crop who respond to TDWI, are still using basic reporting. That clearly points to slower adoption than many in the industry acknowledge.

A major part of the reason for that is an inability to build business messages for newer applications, and therefore an inability for techies to get the dollars to purchase the new systems. I talk about that a lot in this blog: wonderful technology that is adopted only slowly because technical messages don’t interest a business audience.

Then there was the slide titled “Dashboards are the most commonly used kind of analytics.” Dashboards aren’t analytics. Dashboards are a way to display analytics. However, because it takes technology to create the dashboards that hold the analytics, technical folks think technically and the distinction gets fuzzy. Many of the newer analytics tools, including many providing predictive analytics, embed the new analytics alongside others inside dashboards.

One key slide, that might seem obvious but is very important, is about the areas where next generation analytics are being adopted. The top areas are, in order:

  • Marketing
  • Executive management
  • Sales
  • Finance

Two of the first three are focused on top line business, revenue, while the other two have to balance top and bottom line. Yes, operations matters and some of those areas aren’t far behind, but the numbers mean something I’ve repeated and will continue to repeat: Techies need to understand the pressures, needs and communication styles of folks they don’t often understand, and must create stories that address those people. If you take anything from the TDWI report, take that.

One pair of subjects in the talk made me shake my head.

First, the top three groups of people using advanced analytics in companies:

  • Business Analysts: 75%
  • Data Scientists: 56%
  • Business Users: 50%

We can see the weight of need tilted toward the business categories, not the mystically named and overpriced data scientist.

The report has a very good summary of how respondents are trying to overcome the challenges of adopting the new BI solutions (page 22, worth downloading the report). Gaining skills is the first part of that, and some folks claim their way of doing it is to “Hir[e] fewer but more skilled personnel such as data analysts and data scientists.” Those are probably the folks in the middle bullet above, who think a priesthood can solve the problem rather than the most likely solution: providing skills and education to the folks who need to use the information.

TDWI Webinar - Advanced Analytics - Challenges

Fern Halper was very clear about that, even though she’s a data scientist. She pointed out that while executives don’t need to learn how to build models, they do need to understand what the new models need and how to use them. While I think that dismisses the capabilities of many executives, it does bring the information forward. The business analysts are going to work with management to create real models that address real world problems. Specialist statistical programmers might be needed for very complex issues, but most of those people will be hired by the BI vendors.

Q&A, to be honest, hit on a problem with TDWI webinars. There were a couple of business questions, but the overwhelming number of questions were clearly from students looking for study and career advice. That leads to a question about the demographics of the audience and how TDWI should handle future webinars. If they want the audience to be business people, they need to market the webinars better and focus on business issues, from the reports all the way to Q&A. Yes, TDWI is about technologies that can help business, but adoption will remain slow while the focus is on the former and not the latter.

Summary

The latest TDWI Best Practices Report has some interesting information describing the slow adoption of advanced BI in the marketplace. It has some great nuggets to help vendors focus better on a business audience, it suggests IT needs to pay more attention to its users, and it’s more balanced than other recent reports. However, the presentation of the information still makes the same mistake made by many vendors – it doesn’t create the clear, overarching business message needed to speed adoption.

If you missed the presentation, don’t bother watching it on-demand. However, download the report; it’s worth the read.

Teleran at BBBT: Great technology, again with the message…

The BBBT started off 2015 with yet another company with a great technology and a far too simplistic strategy and message. It’s the old problem: Lots of folks come to the BBBT because their small companies are starting to get traction and they want wider exposure, but the management doesn’t really understand Moore’s Chasm, so they’re still pitching to their early adopters rather than the larger market.

Friday’s presenter was Kevin Courtney, VP Business Solutions, Teleran. Back in the day, I evaluated a small technology company for acquisition of its technology and inclusion in Mercury Interactive’s testing suite. It was an SQL inspector that let our products see the transactions going between clients and servers to help improve performance testing.

Teleran has the same basics but has come much further in recent years. The company starts with the same core technology but has layered strong analytics on top to help companies understand database usage and optimize application and network performance. They’ve broken down the issues into three key areas of business concern:

  • Performance and value: How queries are performing, in order to minimize dead data transfer and increase the value of existing computing infrastructure.
  • Risk and Compliance: Understanding who is doing what with data in order to minimize risk and prove regulatory and contract compliance.
  • Modernization, migration and re-platforming: Understanding existing loads, transactions and queries in order to better prepare for upgrading to new technologies – both hardware and software.

In support of these capabilities, Kevin mentioned that they have 8 software patents. While my understanding of patent, trademark and copyright law leads me to believe software patents shouldn’t be legal, they are, and the patents do show innovation in the field. Hopefully.

Mr. Courtney also did a good job giving stories that supported each of these areas. I’ll quickly describe my two favorites (I know, three bullets, but that’s life).

One example showed that value is more than just a dollar figure. He described a financial trading house that used Teleran to analyze the different technology and data usage patterns between their top and bottom performing agents, then used that analysis to provide training to the bottom tranche (yes, I did have to use that word while discussing finance) in order to improve their performance.

The second example combined performance and modernization. He described a company where there were seven unsanctioned data marts pulling full data sets from operational systems. That had a severely negative impact on performance throughout their infrastructure. The understanding of those systems allowed for planning to consolidate, upgrade and modernize their business intelligence infrastructure.

So what’s the issue?

They have a great product suite, but what about strategy? The discussion, with additional information from Chris Doolittle, VP Marketing, via phone, revealed a system that isn’t cheap, and they readily admit they have trouble proving their own value.

Take a look at the Teleran site. Download some case studies. What you see is lots and lots of discussion about the technology. However, even when they do discuss some of the great stories they told us, the business value is still buried in the text. They’re still selling to IT and not providing IT the clear information needed to convince the business users to write the checks.

What’s needed is the typical chasm move of turning things upside down. They need to overhaul their message. They need to boldly lead with the business value and discuss how it’s provided only after describing that value.

What’s also needed is something that will be even harder: changing the product in synchronization with the message. The demonstration showed a product that has had little thought put into the user interface. At one point, after going through four different tables, Kevin said, “if you take x, y and z, then you can see that…” Well, that needs to be clear in a business intelligence driven interface rather than having the pieces scattered around, requiring additional information or thought to figure it out. It’s overcrowded, very tabular, and the dashboards aren’t really dashboards. They need to contract with or hire some UI experts to rethink the interface.

They do OEM Qlik, but there are two problems with that. It looks like they’re using a very old version and aren’t taking advantage of Qlik’s modern BI toolsets. Also, the window with the information has a purely Qlik title. It should read as Teleran’s product, powered by Qlik, in order to keep context.

Summary

Teleran is another company with a great technology that needs to change in order to cross the chasm. Their advantage is that their space, performance analysis, is far less crowded than the database or BI end points. If they can clarify their products and messages, they can carve out a very nice chunk of the market.

Denodo at BBBT: Data Virtualization, an Important Niche

Data virtualization. What is it? A few companies have picked up the term and run with it, including last week’s BBBT presenter, Denodo. The presentation team was Suresh Chandrasekaran, Sr. VP, North America, Paul Moxon, Sr. Director, Product Management & Solution Architecture, and Pablo Alvarez, Sales Engineer. Still, what I’ve not seen is a clear definition of the phrase. The Denodo team did a good job describing their successes and some of the features that contribute to them, but they too avoided a clear definition.

Data Virtualization

The companies doing data virtualization are working to create a virtual data structure where the logical definitions link back to disparate live systems instead of overlaying a single aggregated database of information. It’s the concept of a federated data warehouse from the 1990s, extended past the warehouse and now more functional because of technology improvements.

Data virtualization (and note that, sadly, I don’t create an acronym because DV is also data visualization, and who needs the confusion? So, more typing…) is sometimes thought of by people who hear about it at a high level as a way to avoid data warehouses, but as the Denodo team repeatedly pointed out, that’s not the case. Virtualization can simplify and speed some types of analysis, but the need for aggregated data stores isn’t going away.

The biggest problem with virtualizing everything is that operational systems can’t handle the performance hit of lots of queries. A second is that operational systems don’t typically track the historical information needed for business analysis. Another is that very static data in multiple systems, accessed frequently, can create an unnecessary load on today’s busier and busier networks; consolidating that information can simplify and speed access. Yet another is that change management becomes a major issue, with changes to one small system potentially rippling through many systems and reports. There are others, but none of them undermines the value virtualization does provide.

As Pablo Alvarez discussed, virtualization and a warehouse can work well together to help companies blend data of different latencies, with virtualization bringing in dynamic data to mesh with historic and dimensional information to provide the big picture.
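To make the concept a bit more concrete, here is a minimal sketch in Python of the idea described above: a “virtual” view that queries a live operational source and a warehouse table at request time and joins the results, rather than copying everything into one aggregated store. It illustrates only the pattern, not Denodo’s product or API; the tables, columns and function names are invented for the example.

```python
# A minimal sketch of data virtualization: a logical view that federates a
# live operational source with a historical warehouse table at query time.
# Purely illustrative -- not Denodo's API; all names are made up.
import sqlite3
import pandas as pd

# Stand-ins for two separate systems.
operational = sqlite3.connect(":memory:")   # live orders, low latency
warehouse = sqlite3.connect(":memory:")     # historical, aggregated facts

operational.execute("CREATE TABLE orders (customer_id INT, amount REAL)")
operational.executemany("INSERT INTO orders VALUES (?, ?)",
                        [(1, 120.0), (2, 75.5), (1, 30.0)])
warehouse.execute("CREATE TABLE customer_history (customer_id INT, lifetime_value REAL)")
warehouse.executemany("INSERT INTO customer_history VALUES (?, ?)",
                      [(1, 5400.0), (2, 980.0)])

def virtual_customer_view() -> pd.DataFrame:
    """Resolve the 'virtual' view: query each source live and join the results.
    No data is persisted in a new aggregated store."""
    live = pd.read_sql("SELECT customer_id, SUM(amount) AS open_orders "
                       "FROM orders GROUP BY customer_id", operational)
    hist = pd.read_sql("SELECT customer_id, lifetime_value FROM customer_history",
                       warehouse)
    return live.merge(hist, on="customer_id", how="left")

print(virtual_customer_view())
```

The point of the sketch is the trade-off discussed above: every call to the view hits the operational system again, which is exactly why the approach complements, rather than replaces, an aggregated warehouse.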

Denodo

Denodo seems to have a very good product for virtualization. However, as I keep pointing out when listening to the smaller companies, they haven’t yet meshed their high level ideas about virtualization and their products into a clear message. The supposed marketechture slide presented by Suresh Chandrasekaran was very technical, not strategic. Where he really made a point was in discussing what makes a Denodo pitch successful.

Mr. Chandrasekaran stated that pure business intelligence (BI) sales are a weak pitch for data virtualization and that IT sees the value when there’s a broader data need. That makes absolute sense, as the blend between BI and real-time is just starting and BI tends to look at longer latency data. It’s the firms accessing a lot of disparate systems for all types of productivity and business analysis, beyond the focus on BI, that want to get to those disparate systems as easily as possible. That’s Denodo’s sweet spot.

While their high level message isn’t yet clarified or meshed with markets and products, their product marketing seems to be right on track. They’ve created a very nicely scaled product line.

Denodo Express is a free version of their platform. Paul Moxon stated that it’s fully functional, but it can’t be clustered, it limits result set size and it can’t access certain data sources. However, it’s a great way for prospects to look at the functionality of the product and to build a proof of concept. The other great idea is that Denodo gives Express users a fixed-time pricing offer for enterprise licensing. While not providing numbers, Suresh stated that the offer was working well as an incentive for the freeware not to become shelfware and for prospects to test and move down the sales funnel. To be blunt, I think that’s a great model.

One area they know is a weakness is in services, both professional services and support. That’s always an issue with a rapidly growing company and it’s good to see Denodo acknowledge that and talk about how they’re working to mitigate issues. The team said there are plans to expand their capital base next year, and I’d expect a chunk of that investment to go towards this area.

The final thing I’ll note specifically about Denodo’s presentation is their customer slides. That section had success stories presented as the customers’ own views. That was a strong way to show customer buy-in but a weak way to show clear value. Each slide was very different, many were overly complex and most didn’t clearly show the value achieved. It’s nice, but the customer stories need to be better formalized.

Data Virtualization as a Market

As pointed out above, in the description of virtualization, it’s a very valuable tool. The market question is simple: Is that enough? There have been plenty of tools that eventually became part of a larger market or a feature in a larger product offering. What about data virtualization?

As the Denodo team seems to admit, data virtualization isn’t a market that can stand on its own. It must integrate with other data access, storage and provisioning systems to provide a whole to companies looking to better understand and manage their businesses. When there’s a new point solution, a tool, partnerships always work well early in the market. Denodo is doing a good job with partners to provide a robust solution to companies; but at some point the bigger players no longer want to partner, they want to provide the complete solution themselves.

That means data virtualization companies are going to need to spread into other areas or be acquired. Suresh Chandrasekaran thinks that data virtualization is now at the tipping point of acceptance. In my book, given how fast the software industry, in general, and data infrastructure markets, in particular, grow and evolve, that leaves a few years of very focused growth before the serious acquisitions happen – though I wouldn’t be surprised if it starts sooner. That means companies need to be looking both at near term details and long term changes to the industry.

When I asked about long term strategy, I got the typical startup answer: They’re focused on internal growth rather than acquisition (either direction). That’s a good external message because folks who want a leading edge company want it clear that they’re using a leading edge company, but I hope the internal conversations at the CxO level aren’t avoiding acquisition. That’s not a failure, just a different version of success.

Summary

Denodo is a strong technical company focused on data virtualization in the short run. They have a very nicely scaled model from Denodo Express to their full product. They seem to understand their sweet spot within IT organizations. Given that, any large organization looking to get better access to disparate sources of data should talk with Denodo as part of their evaluation process.

My only questions are about the marketing messages and whether or not Denodo will be able to change from a technical sale to a higher level, clearer vision that will help them cross the chasm. If not, I don’t think their product is going away; someone will simply acquire them. Regardless, Denodo seems to be a strong choice to look at to address data access and integration issues.

Data virtualization is an important niche; the questions that remain are how large the niche is and how long it will stay independent.

Data Governance and Self-Service Business Intelligence: History Repeating?

Self-Service BI is a big buzz phrase these days, even though many definitions exist. However, one thing is clear: It’s driving another challenge in the area of data governance. While people are starting to talk about this, it’s important to leverage what we’ve learned from the past. Too many technology industry folks are so enamored of the latest piece of software or hardware that they convince themselves their solutions are so new they are revolutionary, “have no competitors” or otherwise rationalize away context. However, the smart people won’t do that.

A Quick History Lesson: The PC

In 1982, I was an operator on one of Tymshare’s big iron floors. It was a Sunday and I was reading my paper, sitting at the console of an IBM 370/3033, their top of the line business computer. The business section had an article titled something like “IBM announces their 370 on a chip.” I looked up at my behemoth, looked back at the article and knew things would change.

Along came the PC. Corporate divisions and departments, frustrated at not getting enough resources from the always understaffed, underfinanced and overburdened IT staff, jumped on the craze. Out with the IBM Selectric and in with the IBM AT and its successors and clones.

However, by the end of the decade and early in the 1990s, corporate executives realized they had a problem. While it was great that each office was becoming more productive, the results weren’t as helpful. It’s a lot harder to roll up divisional sales data when each territory has slightly different definitions of its territories, leads and funnels. It’s hard to make manufacturing budget forecasts when inventory is stored in different formats and might use different aging criteria. It’s hard to show a government agency you’re in regulatory compliance when the data is in multiple, non-integrated systems.

Data governance had been lost. The next twenty years saw the growth of client/server software such as that from Oracle and SAP, linking all offices to the same data structures and metadata while still leaving enough independence. That balance between centralized IT control and decentralized freedom of action is still being worked out, but it is necessary.

While the phrase “single version of truth” is often mistakenly applied to mean a data warehouse and a “single source of truth,” that’s not what it means. A single version of the truth means shared data and metadata that ensure all parties looking at the same data come up with the same information – if not the same conclusions from that information.
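As a small illustration of that definition, here is a hedged sketch of shared metadata in code: two groups slice the same data differently, but because they import one agreed definition, their numbers reconcile. The metric, threshold and names are hypothetical, chosen only to show the idea.

```python
# A minimal sketch of "single version of the truth" as shared data plus shared
# metadata: both departments compute "qualified lead" from one agreed
# definition instead of maintaining their own. All names are hypothetical.
from dataclasses import dataclass

@dataclass
class Lead:
    score: int
    region: str

# The shared metadata: one definition everyone imports.
QUALIFIED_SCORE_THRESHOLD = 70

def is_qualified(lead: Lead) -> bool:
    """The single agreed definition of a qualified lead."""
    return lead.score >= QUALIFIED_SCORE_THRESHOLD

leads = [Lead(85, "East"), Lead(60, "West"), Lead(72, "East")]

# Sales and marketing may slice the data differently, but because they share
# the same definition, their totals reconcile.
sales_view = sum(is_qualified(l) for l in leads if l.region == "East")
marketing_view = sum(is_qualified(l) for l in leads)
print(sales_view, marketing_view)  # 2 3
```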

Now: Self-Service BI

Look at the history of the BI market. There have always been reports. With the advent of the PC, we had the de facto standard of Crystal Reports for a generation. Then, as the growth of packaged ERP, CRM, SFA and other systems came along, so did companies such as Cognos and Business Objects to focus on more complex analysis. However, they were still bound by the client/server model that was tied primarily to mid-tier Unix servers and Microsoft/Apple PCs.

What’s changed now is the evolution of the internet into the Cloud and of phones into smartphones and tablets. Where divisions and departments were once leashed to big iron and CICS screens, divisions more recently tied to desktops are feeling their oats and are interested in quickly developing their own applications that let their knowledge workers access information while not seated in the office.

Self-Service BI (And, no, I’m not going to make an acronym as many have. Don’t we have enough?) is the PC of this decade. It’s letting organizations get information to people without waiting for IT, which is still underfunded, understaffed and overburdened, to distribute information widely. Alas, that wide distribution comes without controls and without audit trails. Data governance is again being challenged.

I’ve listened to a number of presentations by vendors to the BBBT, and there is hope. Gone are the days when all BI companies talked about was helping business people avoid using IT. There’s more talk about metadata, more interest in security and access control, and a better ability to provide audit trails. There’s an understanding that it’s great to let every knowledge worker look at the data and find the pieces of information that address their needs, while still ensuring that the base data is consistent and the metadata is shared.

Summary

We can learn from history. The PC was a great experiment in watching the pendulum swing from almost complete IT control to almost no IT control then back to a more reasonable middle. The BI community shows signs of learning from history and making a much faster switch to the middle ground. That’s a great thing.

Technologists working to help businesses improve performance through data, BI and analytics need to remember the great quote from Daniel Patrick Moynihan, “Everyone is entitled to his own opinion, but not his own facts.”

Tableau Software Analyst Briefing: Mid-size BI success and focus on the future

Yesterday, Tableau Software held an analyst briefing. It wasn’t a high level one; it was really just a webinar where they covered some product futures under NDA. However, it was very unclear what was NDA and what wasn’t. Things discussed at the most recent Tableau Conference in Seattle aren’t NDA, but there was plenty of future discussed, so I’ll walk a fine line.

The first news item is their Third Quarter announcement from the beginning of the month. This was Tableau’s first quarter of over $100 million in recognized revenue. It’s a strong showing and they’re justifiably proud of their consistent growth.

Ajay Chandrdamouly, Analyst Relations, also said that the growth primarily results from a Land and Expand strategy: beginning with small jobs in departments or divisions, driven by business needs, then expanding into other organizations and eventually into a corporate IT account position. However, one interesting point came in an expansion mentioned later in the presentation by Francois Ajenstat, Product Management, during the usual case studies seen in such presentations. He did a good job of showing one case study that was Land and Expand, but another began as a corporate IT account, with usage driven outward from there. It’s an indication of the maturity of both Tableau and the business intelligence (BI) market that more and more BI initiatives are being driven by IT from the start.

Francois’ main presentation was about releases, past and future. While I can’t write about the latter, I’ll mention one concern based on the former. He was very proud of the large number of frequent updates Tableau has released. That’s fine in the Cloud, where releases are quickly rolled into the product that everyone uses. However, it’s a support risk for on-premises (yes, Francois, the final S is needed) installations. How long you support products, and how you support them, is an issue. Your support team has to know a large number of variations to provide quick results, or must investigate and study each time, slowing responses and possibly angering customers. I asked about the product lifecycle and how they manage support and sunsetting decisions, but I did not get a clear and useful answer.

The presentation Mr. Ajenstat gave listed six major focus themes for Tableau, and that’s worth mentioning here:

  • Seamless Access to Data
  • Analytics & Statistics for Everyone
  • Visual Analytics Everywhere
  • Storytelling
  • Enterprise
  • Fast, Easy, Beautiful

None of those is a surprise, nor is the fact that they’re trying to build a consistent whole from the combination of foci. The fun was the NDA preview of how they’re working on all of those in the next release. One bit of foreshadowing: they are looking at some things that won’t diminish the enterprise products but will be aimed at a non-enterprise audience. They’ll have to be careful how they balance the two, but expansion done right brings a wider audience, so it can be a good thing.

The final presenter was Ellie Fields, Product Marketing, who talked more about solutions than products. Tableau Drive is not something to do with storage or big data; it’s a poorly named but well thought out methodology for BI projects. Industry firms are finally admitting they need some consistency in implementation and so are providing best practices to their implementation partners and customers to improve success rates, speed implementations and save costs. Modern software is complex, as are business issues, so BI firms have to provide a combination of products and services that help in the real world. Tableau Drive is a new attempt by the company to do just that. There’s also no surprise that it uses the word agile, since that’s the current buzzword for iterative development that had been going on long before the word was applied. As I’m not one who has implemented BI products, I won’t speak to its effectiveness, but something like Drive is a necessity in the marketplace and Tableau Drive helps provide a complete solution.

Summary

The briefing was a technical analyst presentation by Tableau about the current state of the company and some of its futures. There was nothing special, no stunning revelations, but that’s not a problem. The team’s message is that the company has been growing steadily and well and that its plans are set to continue that growth. They are now a mid-size company; no longer as nimble as a startup, yet without the weight of the really large firms, they have to chart a careful path to continue their success. So far it seems they are doing so.

Magnitude/Kalido Webinar Review: Automated and Agile, New Principles for Data Warehousing

I watched a webinar yesterday. It was sponsored by Magnitude, the company that is the result of combining Kalido and Noetix. The speakers were Ralph Hughes, a data warehousing consultant operating as Ceregenics, and John Evans of Magnitude.

Ralph Hughes’ portion of the presentation was very interesting, in a great way. Rather than talking about the generalities of enterprise data warehouses (EDW) and agile, he was the rare presenter who discussed things clearly and in enough detail for coherent thought. It was refreshing to hear more than the usual tap dance.

Webinar - Magnitude - Ceregenics slide

Ralph’s slide on the advantages of agile development for EDWs is simple and clear. The point is that you don’t know everything when you first gather requirements and then design a system. In the waterfall approach, much of the coding, testing and usage is wasted time as you find out you need to do extra work for new requirements that pop up. Agile allows business users to quickly see concepts and rethink their approaches, saving both time to productivity and the overall time and effort of development.

After talking about agile for a bit, he pointed out that it does save some time but still leaves lots of basic work to do. He then shifted to discuss Kalido as a way to automate some of the EDW development tasks in order to save even more time. He used more of his presentation to describe how he’s used the tool at clients to speed up creation of data warehouses.

One thing he did better in voice than on slides was to point out that automation in EDW doesn’t mean replacing IT staff. Rather, appropriately used, it allows developers to move past the repetitive tasks and focus on working with the business users to ensure that key data is encapsulated in the EDW so business intelligence can be done. Another key area he said automation can’t handle well is derived tables. Those still require developers to extract information, create the processes for building the tables, then move the tables back into the EDW to, again, enhance BI.
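For readers who haven’t built one, here is a minimal sketch of that derived-table workflow: extract from the warehouse, compute the derived values, and load the result back so BI tools can query it. It is illustrative only, with invented table and column names, and has nothing to do with Kalido’s actual automation.

```python
# A minimal sketch of the derived-table work described above: pull data out of
# the EDW, build a derived table, and load it back for BI consumption.
# Purely illustrative; all table and column names are made up.
import sqlite3

edw = sqlite3.connect(":memory:")  # stand-in for the warehouse
edw.execute("CREATE TABLE sales_fact (region TEXT, quarter TEXT, revenue REAL)")
edw.executemany("INSERT INTO sales_fact VALUES (?, ?, ?)",
                [("East", "Q1", 100.0), ("East", "Q2", 140.0),
                 ("West", "Q1", 90.0), ("West", "Q2", 80.0)])

# 1. Extract and derive: quarter-over-quarter growth per region.
rows = edw.execute("""
    SELECT region,
           SUM(CASE WHEN quarter = 'Q1' THEN revenue END) AS q1,
           SUM(CASE WHEN quarter = 'Q2' THEN revenue END) AS q2
    FROM sales_fact GROUP BY region
""").fetchall()
derived = [(region, (q2 - q1) / q1) for region, q1, q2 in rows]

# 2. Load the derived table back into the EDW so BI tools can use it.
edw.execute("CREATE TABLE sales_growth (region TEXT, qoq_growth REAL)")
edw.executemany("INSERT INTO sales_growth VALUES (?, ?)", derived)
edw.commit()

print(edw.execute("SELECT * FROM sales_growth").fetchall())
```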

Notice that while Mr. Hughes spoke to the specifics of creating EDWs, he always presented them in the context of getting information out. Many technical folks spend too much time focused on what it takes to build the EDW, not on why it’s being built. His reminders were again key.

John Evans’ presentation was brief, as I always like to see from the vendors, rounding out what his guest speaker said. He had three main points.

First, the three main issues facing IT in the market are: Time to value, time to respond to change and total cost of ownership. No surprise, he discussed how Magnitude can address those.

Second, within his architecture slide, he focused on metadata and what he said was strong support for master data and metadata management. Given the brief time allotted, it was only an allusion to the strengths, but the fact that he spoke to it was a good statement of the company’s interests.

Third, he discussed the typical customer stories and how much time the products saved.

Summary

The webinar was very good exposure to concepts for an audience thinking about how to move forward in data warehousing, whether to build EDWs or maintain them. How agile development and an automation tool can help IT better focus on business issues and more quickly provide business benefit was a story told well.

Revolution Analytics at BBBT: Vision and products for R need to mesh

Revolution Analytics presented to the BBBT last Friday. The company is focused on R, with a stated corporate vision of “R: The De-facto standard for enterprise predictive analytics.” Bill Jacobs, VP, Product Marketing, did most of the talking, while Steve Belcher, Sales Engineer, gave a presentation.

For those of you unfamiliar with R as anything other than the letter smack between Q and S, R is an open source programming language for statistics and analytics. The Wikipedia article on R points out it’s a combination of Scheme and S. As someone who programmed in Scheme many years ago, the code fragments I saw didn’t look like it, but I did smile at the evolution. At the same time, the first thing I said when I saw Revolution’s interactive development environment (IDE) was that it reminded me of EMACS, only slightly more advanced after thirty years. The same wiki page also says that R is a GNU project, so now I know why.

Bill Jacobs was yet another vendor presenter who mentioned that his company realized the growth of the Internet of Things (IoT) means a data explosion that leaves what is currently misnamed as big data in the dust as far as data volumes go. He says Revolution wants to ensure that companies are able to effectively analyze IoT and other information and that his company’s R is the way to do so.

Revolution Analytics is following in the footsteps of many companies which have commercialized freeware over the years, including Sun with Unix and Red Hat with Linux. Open source software has some advantages, but corporate IT and business users require services including support, maintenance, training and more. Companies which can address those needs can build a strong business and Revolution is trying to do so with R.

GUI As Indicative Of Other Issues

I mentioned the GUI earlier. It is very simple and still aimed at very technical users, people doing heavy programming who understand detailed statistics. I asked why and was told that they felt that was their audience. However, Bill had earlier talked about analytics moving from the data priests to business analysts and end users. That’s a dichotomy. The expressed movement is a reason for their vision and mission, but their product doesn’t seem to support that mission.

Even worse was the response when I pointed out that I’d worked on the Apple Macintosh before and after MPW was released and had worked at Gupta when it offered the first 4GL on the Windows platform. I received a long-winded answer as to why moving to a better and easier to use GUI wasn’t in the plans. Then Mr. Jacobs mentioned something to the effect of “You mentioned companies earlier and they don’t exist anymore.” Well, let’s forget for a minute that Gupta began a market, that others such as Powersoft also did well for years, and that even after Microsoft came out with its Visual products to control the market, there had been many good years for other firms and the products are still there. Let’s focus on wondering when Apple ceased to exist.

It’s one thing to talk about a bigger market message in the higher points of a business presentation. It’s another, very different, thing to ensure that your vision runs through the entire company and product offering.

Along with the vision mentioned above, Revolution Analytics presents a corporate mission to “Drive enterprise adoption of R by providing enhanced R products tailored to meet enterprise challenges.” Enterprise adoption will be hindered until the products work for more than specialist programmers and can address a wider enterprise audience.

Part of the problem seems to be shown in the graphic below.

Revolution Analytics tech view of today

Revolution deserves credit for accurately representing the current BI space in a snapshot. The problem is that it is a snapshot of today, and there wasn’t an indication that the company understands how rapidly things change. Five to ten years ago, the middle column was the left column. Even today there’s a very technical need for the people who link the data to those products in order to begin analysis. In the same way, much of what is in the right column was in the middle. In only a few years, the left column will be in the middle and the middle will be on the right.

Software evolves rapidly, far more rapidly than physical manufacturing industries. Again, in order to fulfill their enterprise mission, Revolution Analytics’ management is going to have to address what’s needed to move towards the right-hand columns that represent enterprise adoption.

Enterprise Scalability: A Good Start

One thing they’ve done very well is build out a scaled product suite that attracts different sized businesses, individual departments and others, widening their audience.

Revolution Analytics product suite

They seem to have done a good job of providing a layered approach from free use of open source to enterprise weight support. Any interested person should talk with them about the full details.

Summary

R is a very useful analytical tool and Revolution Analytics is working hard to give businesses the ability to use R in ways that leverage the technology. They’re working to support groups who want pure, free open source as well as others who want true enterprise support, in the way other open source companies have succeeded in previous decades.

Their tool does seem powerful, but it is still clearly and admittedly targeted at the very technical user, the data priests.

Revolution Analytics seems to have the start of a good corporate mission and I think they know where they want to end up. The problem is that they haven’t yet created a strategy that will get them to meet their vision and mission.

If you are interested in using R to perform complex analysis, you need to talk to Revolution Analytics. They are strong in the present. Just be aware that you will have to help nudge them into the future.

The Market Positioning Document: Consistency Helps the Message

I regularly comment on how companies communicate their benefits. One thing I often see is a scatter-shot approach to marketing. Some ideas are semi-formalized and then thrown out to whatever channels are handy. That usually doesn’t work.

You may have heard the term Integrated Marketing. It’s the idea that all your marketing messages should be consistent throughout all your communications channels.

Integrated marketing means more than marketing informally telling different people similar messages. It means formalizing your marketing message, distilling it to the core so that each channel group can work off of the same message to customize it appropriately for each channel. That’s where the positioning document comes in.

The Product Marketing Manager (PMM) is usually the owner of the message for each product, while corporate marketing can have a separate positioning document for the business as a whole. As I’m talking about how to better market products, I’ll be referring to the PMM. Sadly, there’s not enough understanding of the need for a PMM in many small companies, which is one reason the messaging tends not to solidify, but this article will refer to the position.

The market positioning document should be a tool for consistency in all channels. That means it’s an internal tool as long as “internal” also means resellers or any partner who customizes messaging.

The Positioning Document

A positioning document for enterprise software solutions needs to address the following key issues:

  • What is it: If you can’t describe the product simply, you don’t understand it.
  • Why does it matter: What technical problems are you solving for the market?
  • What’s the value: How does that solution benefit the business and the stakeholders?
  • Target Audience: Speaking of stakeholders, who are they?
  • Competition: What issues matter in the competitive landscape and how are they addressed?

While all the issues matter, it’s the middle one that, like any deli sandwich, is the meat. What you have and what the market wants meet at your value. Part of that value section can be the elevator pitch, but it has to make clear why somebody would want to write you a check.

There are a number of ways of creating the positioning documents, so there’s no single template to define. What I’ve seen are two typical directions depending on the size and focus of the company.

Startups and early stage companies are typically focused on their technology. They’ve built something they think is needed and that a couple of prospects, typically known personally by the founders, think they want. They need to formalize their market positioning, so they should start with what they have. The ordered list of bullets above is a good flow for companies at that stage to clarify what they have and then figure out the larger market that can boost sales.

However, mid-size and larger companies should have things turned around. They should have changed from finding a market for cool technology to building technology for a market they understand. That means starting with the target audience. What are their pain points? Which of their business needs require help? Then look at those and understand where you can add value. From there, adjust your products or create new ones. The positioning document should help define products rather than just describe them.

One critical item that should run throughout the positioning document, though not mentioned explicitly above, is the simple fact that product marketing doesn’t exist in a void. PMMs are part of a larger corporation. Do not create a positioning document within the product marketing group alone; ensure that the messaging matches corporate strategy. While that might sound obvious, I’ve seen examples of different PMMs creating positioning documents that contradict each other because of a product focus that doesn’t take corporate needs into account.

Document Usage

The PMM controls the product messaging for public relations, advertising, analyst briefings and more. Being involved in all of those tasks at a detailed level is a huge strain on a busy job. If the positioning document can serve as basic boilerplate, it can save the PMM time. Whether the corporate marketing team extracts information to combine with other material and then runs it by the PMM, or the PMM uses it as the basis for multiple documents to quickly hand off to the team, everyone’s job is made easier and more effective.

An oft-overlooked use of the document is SEO/SEM. Keywords matter. Trying to sit and think them up in a void is often an experiment in randomness. However, if you can distill what you’re doing into a core value statement, your keywords arise naturally. Depending on the search engine and the campaign, terms for the specific target audience can help raise results, as can understanding competitive positioning. The SEO/SEM team can work from the positioning document to test keywords and bring the results back to the PMM for analysis and refinement.

Don’t forget channel partners. While smaller partners can directly access your collateral or simply add their logo and contact information, larger partners have their own processes, standards and templates. The positioning document can provide consistency across organizations, an even more important task than providing consistency within your own organization.

A final benefit for all PMMs can be summed up in one word: re-use. The PMM is the source of the product message and has to keep abreast of all the places where the products are mentioned. If you can clarify and distill your message into a single document, you not only help the company but yourself as well. You’re no longer remembering the multiple documents where pieces exist or managing a large folder of examples. You have the document. You can boilerplate lots of information. When you distribute it and the rest of the marketing or sales team calls with questions, you can refer them to the standard usage and messages, then more quickly help them adjust the messages to any unique environment.

Conclusion

The market positioning document should be a key tool used by every product marketing manager. It will help you focus your product message and then improve the effectiveness of working with others to customize that message for each channel of distribution. Good use of product positioning documents can create powerful messages that repeatedly address the key needs of the market across all channels, providing a consistent front that helps your company show value to prospects.

MicroStrategy at BBBT: A BI Giant Working to Become More Agile

Last Friday’s BBBT presentation was by Stefan Schmitz, VP Product Management, MicroStrategy. This will be a short post because a lot of the presentation was NDA. Look to MicroStrategy World in January for information on the things discussed.

The Company

The primary purpose of the public portion of the presentation was to discuss the reorganization and refocus of MicroStrategy. Stefan admitted that MicroStrategy has always been weak on marketing and that in recent years Michael Saylor has been focused on other issues. Mr. Schmitz says those things are changing: Saylor is back and they’re focusing on getting their message out. In case you’re wondering why a company that claims to be pushing marketing showed up with only a product management guy, they’d planned on also having a product marketing person, but life intervened. Stefan’s message clearly had strong marketing input and preparation, so I believe the focused message.

When we discussed the current market, Paul te Braak, another BBBT member, asked a specific question about where MicroStrategy saw self-service analytics. Stefan responded, accurately, that it is self-service for analysts only, and that the systems are too simple and miss real data access.

One key point was the company’s view of the market as shown below.

MicroStrategy market view

The problem I have is that data governance isn’t there. It’s in some of the lower level details presented later, but that’s not strong enough. The blend of user empowerment and corporate governance requirements won’t be achieved until the latter is perceived as a top priority by technical folks. MicroStrategy is a company coming from the early days of enterprise business intelligence and I’d expect them to emphasize data governance the way a few other recent presenters have done; the lack of that priority is worrisome.

The Technology

On the technology side, there were two key issues mentioned in the open session.

The first was a simplification of the product line. As Mr. Schmitz pointed out, they had 21 different products and that caused confusion in the market, slowing sales cycles and creating other problems. MicroStrategy has simplified its product structure to four products: The server, the architect for developing reports and dashboards, and separate products for Web and mobile delivery.

The second is an AWS cloud implementation along with data warehousing partners in order to provide those products as part of a scalable and complete package. This is aimed at helping the company move downstream to address departmental, divisional and smaller company needs better than their history of mainstream IT sales has allowed.

This is still evolving and the company can give you more information, as you’d expect.

More was mentioned but, again, it was under NDA.

Summary

MicroStrategy is an established BI vendor, one of the older group of companies and, unlike Cognos and Business Objects, still standing on its own. The management knows that the newer vendors have started with smaller installations and are moving towards enterprise accounts, so it is making changes in order to go the other direction. The company wants to expand from the core enterprise space into the newer and more agile areas of BI. Their plans seem strong; we’ll have to watch how they implement them.