Music has always been an integral part of human culture. From ancient times to the modern day, it has evolved and transformed in countless ways. With the advent of technology, music creation and discovery have taken a wild and increasingly influential new turn. Enter generative AI. For those living under a rock, artificial intelligence (AI) has revolutionized the way we do almost everything, including how we create and discover music.
@OpenAI #Jukebox (https://www.openai.com/research/jukebox) is a prime example of how AI is bridging the gap between music and future technology. Per OpenAI, Jukebox “produces a wide range of music and singing styles, and generalizes to lyrics not seen during training. All the lyrics below have been co-written by a language model and OpenAI researchers.” In other words, it can generate original songs and tracks in a variety of genres and styles, much as the creativity powerhouse known simply as DALL-E does for image creation. It uses deep learning algorithms to analyze the patterns and structures of existing songs in order to create new ones.
So does the future of music creation and discovery lie in generative AI tools like OpenAI Jukebox, the way design and art creation increasingly lie in tools like DALL-E? Either way, it’s the season of all things #OpenAI. These days you can’t escape a ChatGPT meme or an SNL skit powered by the little chatterbox, which provides endless possibilities and countless hours of entertainment for experimentation with different words, phrases, images, sounds, styles, and genres. And did I mention it can also help you work smarter, not harder, with email responses, documentation creation, or even building apps for you (i.e., code scripting)?
Generative AI tools like OpenAI Jukebox are not limited to creating new songs; they can also remix existing ones. This opens up a whole new world of possibilities for artists who want to experiment with their work or collaborate with other musicians.
The use of generative AI tools in music creation also raises questions about copyright laws and ownership rights. As these tools become more advanced, it will be interesting to see how they impact traditional copyright laws.
As a former EDM DJ, I’m excited to see where OpenAI takes its Jukebox research – IMHO, it is just one example of how AI can revolutionize the world of music creation and discovery. As technology continues to evolve at this crazy rapid pace, it’s exciting to think about what other possibilities lie ahead for the world of AI and ( fill in here ). The future looks bright for humans, musicians, artists, and fans alike as we march on this yellow brick journey toward... what? Emerald City, or a more innovative musical landscape powered by artificial intelligence? Who knows? Let’s ask ChatGPT!
Technology Trends
Get the Bing app with the new AI-powered Bing now:
https://bingapp.microsoft.com/bing?adjust=gfv2crx_717ll9t&style=sydney
Explaining #Containers / #Kubernetes to a Child: How To Become a Storytelling Steward Using Gamification & Graphic Novels (Comics)
If someone asked you to explain the benefits of #Containers / #Kubernetes as though you were speaking to a child – in under 2 minutes, and guaranteeing said kiddo’s comprehension – could you do it?
I was tasked recently with doing the same thing by my COO, using Machine Learning as the main talking point. To elucidate, I decided to pose the same question to several peers who graciously entertained this whim o’ mine. While they provided technically correct explanations, they often parked their responses somewhere between boredom-block and theoretical thoroughfare. Yawn!
What those very intelligent practitioners failed to remember was that this was NOT the latest round of stump-the-chump; the goal was to explain Machine Learning in a way that a child could grok, without “adult-splaining,” “grownup-eze,” or other explanatory crutches. Add to this the goal of keeping the child engaged for the full 2 minutes, and, well, shoot – ++ to you, dear reader turned supreme storytelling savant, if you could make that happen. While we are at it, why not add the ability to explain ML while avoiding the dreaded “eyes glossing over” effect most listeners don when tuning out their brain. This ‘Charlie Brown’ <whoah whoah whoah> adult-vernacular riposte is nearly always reflected back to the speaker via those truth-telling eyes of yours, enabling the Edgar Allan Poe in us all. Huh? What I mean is the tell-tale(-heart), Pavlovian response to any “Data Science-based” summarization, in my experience.
Instead, I described two scenarios involving exotic fruit. In the 1st, I included the name of each of those curious fruits (aka labels), which was the basis for her being able to label them on demand afterward. The 2nd scenario also involved exotic fruits, BUT the difference was that she was NOT provided any names ahead of time, yet was still tasked with naming said items. And for those data scientists reading this: naturally, these were metaphors for supervised and unsupervised learning (a tiny code sketch of the same contrast follows below).
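For the data scientists following along, here is a tiny, hypothetical sketch of the same contrast in code – the “fruit” features and names are made up purely to mirror the metaphor, not drawn from any real dataset:

```python
# A minimal sketch of the fruit metaphor. The features (sweetness, size) and
# fruit names are hypothetical, purely for illustration.
from sklearn.neighbors import KNeighborsClassifier
from sklearn.cluster import KMeans

features = [[7, 2], [8, 3], [2, 9], [1, 8]]           # e.g., [sweetness, size]
labels   = ["rambutan", "rambutan", "durian", "durian"]

# Supervised: fruits come with names (labels), so the model learns to repeat them.
classifier = KNeighborsClassifier(n_neighbors=1).fit(features, labels)
print(classifier.predict([[7.5, 2.5]]))               # -> ['rambutan']

# Unsupervised: no names given, so the model can only group similar fruits together.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(clusters)                                       # e.g., [1 1 0 0] – groups, not names
```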
Originally, I had prepared a similar talk for an ML-centric presentation I was set to give at a community-based data science event (later shared on this blog) – it contains about 90% comics / image iconography instead of laborious text per slide and was received incredibly well. In fact, when delivered to a 200+ audience, it was met with much applause and higher-than-normal attendee survey satisfaction scores. Simplicity and pacing: remember, an image speaks 1000 words, and to a child who often learns experientially/visually, well, it becomes the storyteller’s handbook to the hive mind of children everywhere :).
By the way, you can stop me next time when I diverge so sharply from the path.
But now we are back –> putting to bed my thorough digression & pulling you back to my 1st sentence above:
If someone asked you to explain the benefits of #Containers / #Kubernetes as though you were speaking to a child – in under 2 minutes, and guaranteeing said kiddo’s comprehension – could you do it?
This awesome comic/bedtime story is one way to answer with a resounding YES! Meet Phippy & Zee and follow their adventures as they head off to the zoo: Phippy Goes to the Zoo – A Kubernetes Story: https://azure.microsoft.com/en-us/resources/phippy-goes-to-the-zoo/en-us/
DevOps for Data Science – Part 1: ML Containers Becoming an Ops Friendly Citizen
So, for the longest time, I was in the typical Data Scientist mind space (or at least what I personally thought was ‘typical’) when it came to CI/CD for my data science project implementations –> DevOps was for engineering / app dev projects, or lift-and-shift / infrastructure projects. Not for the work I did – right? The Dev vs. IT (DevOps) battle seemed to rage on while we Data Scientists quietly pursued our end results outside of this traditional argument:
At the same time, the rise and subsequent domination of K8s (Kubernetes), Docker, et al. <containerization> was my 1st introduction to DevOps for Data Science. And in the beginning, I didn’t get it, if I am being honest. Frankly, I didn’t really get the allure of containers <that was my ignorance>. When most data scientists start working, they realize that the majority of data science work involves getting data into the format the model needs. Even beyond that, the model being developed will need to be operationalized as part of some type of web/mobile/custom application for the end user.
Now most of us data scientists have the minimum required / viable processes to handle things like versioning / source control et al. Most of us have our model versions controlled on Git. But is that enough?
It was during an Image Recognition workshop I was running for a customer – one that required several specific image pre-processing and deep learning libraries in order to script out a complete, end-to-end image recognition + object detection solution. In the end, it was scripted using Keras on TensorFlow (on Azure) with the COCO-Stuff 2018 dataset + YOLO real-time object detection, which I augmented with additional images/labels specific to my use case and industry (aka ‘Active Learning’):
- Active Learning Workflow
Active Learning is an example of semi-supervised learning in which an algorithm interactively asks for more labeled data in order to improve model performance.
Labeling is often rushed because it doesn’t carry the cachet of other steps in the typical data science workflow – and getting the data preprocessed (in this case, images and labels) is a necessary evil if you want to achieve better model performance in terms of accuracy, precision, recall, F1 – whichever, given the specific algorithm in play and its associated model evaluation metric(s):
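To make the loop concrete, here is a minimal, hypothetical sketch of uncertainty-sampling-style active learning – the dataset, model, and batch size are illustrative stand-ins, not the workshop’s actual pipeline:

```python
# A minimal sketch of an active-learning loop: train, find the examples the model
# is least sure about, "ask" for their labels, repeat. Dataset and model are
# illustrative assumptions only.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression

X, y = load_digits(return_X_y=True)
labeled = list(range(50))                      # start with a small labeled pool
unlabeled = list(range(50, len(X)))

model = LogisticRegression(max_iter=2000)
for round_ in range(5):
    model.fit(X[labeled], y[labeled])
    probs = model.predict_proba(X[unlabeled])
    uncertainty = 1 - probs.max(axis=1)        # least-confident predictions first
    query = np.argsort(uncertainty)[-20:]      # request labels for the 20 haziest items
    labeled += [unlabeled[i] for i in query]
    unlabeled = [u for i, u in enumerate(unlabeled) if i not in set(query)]
    print(f"round {round_}: labeled={len(labeled)}, "
          f"pool accuracy={model.score(X[unlabeled], y[unlabeled]):.3f}")
```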
And it was during the setup/installation of these libraries that the benefit of containerization for Data Science became so clear to me – if you have always scripted locally or on a VM, you will understand the pain of maintaining library/package versions, whether you use Python or R or Julia or whatever the language du jour is for scripting your model parameters/methods, etc.
And when version conflicts come into play, you know how much time gets wasted searching / Googling / Stack Overflowing for a resolution (ooh, those version dependency error messages are my FAVE <not really, sigh… but I digress>).
Even when you use Anaconda or Miniconda (“conda”) for environment management, you are cooking with gas until you are not: like when your project requirements demand you pip/conda install very rare or specific libraries/packages that have their own version dependencies/prerequisites, only to hit an error during the last package install step advising that some other required upstream package version is incorrect/outdated, causing your whole install to roll back. Fun times <and this is why Cloud infrastructure experts exist>; but it takes away from what Data Scientists are chartered with doing when working on an ML/DL project. <Sad but true: most Data Scientists will understand / commiserate with what I am describing as a necessary evil in today’s day and age.>
OK, and now we are back: enter containers. How simple is it to have a Dockerfile (for example) which contains all the commands a user could call via the CLI to assemble an image – including all of the packages/libraries and their dependencies, pinned by version, for a set Python kernel <2 or 3> and version (2.6, 2.7, 3.4, 3.5, 3.6, etc.) – for the specific project I described above? Technically speaking, Docker can build images automatically by reading the instructions from this Dockerfile. Further, using docker build, users can create an automated build that executes several command-line instructions in succession. –> Right there, DevOps comes clearly into the picture, where the benefits of environment management (for starters) and the subsequent time savings / headache avoidance become greater than the learning curve for this potentially new concept.
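As a rough illustration of that idea (not the workshop’s actual image), here is a minimal sketch that writes a pinned Dockerfile and builds it by shelling out to the Docker CLI – the package versions, image tag, and entry script are assumptions for the sake of the example:

```python
# A minimal, hypothetical sketch: write a version-pinned Dockerfile for an
# image-recognition project and build it via the Docker CLI (Docker must be installed).
import subprocess
from pathlib import Path

dockerfile = """\
FROM python:3.6-slim
# Pin every library the project needs so the environment is reproducible anywhere.
RUN pip install --no-cache-dir \\
    tensorflow==1.12.0 \\
    keras==2.2.4 \\
    opencv-python-headless==3.4.5.20 \\
    pillow==5.4.1
WORKDIR /app
COPY . /app
CMD ["python", "score_images.py"]
"""

Path("Dockerfile").write_text(dockerfile)
# Equivalent to running `docker build -t imagerec-workshop:latest .` at the CLI.
subprocess.run(["docker", "build", "-t", "imagerec-workshop:latest", "."], check=True)
```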
There are some other points to note to make this happen in the real world: something like VSTS would be used to build the project into a Docker image, which would then be pushed to a Docker container registry on a cloud provider like Azure. Once in the registry, it would be orchestrated using Kubernetes.
Right about now, your mind is wanting to completely shut down. Most data scientists know how to hand off a CSV file of predictions, or a scoring web service centered on image recognition/classification, to a member of the AppDev team to integrate into an existing app.
However, what about versioning / controlling the model itself? Each time you tune hyperparameters within a model, you are potentially changing the model’s performance – how do you know which set of ‘tunes’ resulted in the highest evaluation score? I think about this all of the time, because even if you save your changes in distinct notebooks (using JupyterHub et al.), you have to be very prescriptive with your naming conventions to compare the changes side by side across every tuning session you conduct. A minimal tracking sketch follows.
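Here is a minimal sketch of one way to track tuning runs outside of notebook file names – the registry file name, parameters, and metrics are hypothetical placeholders:

```python
# A minimal, hypothetical sketch of logging each tuning run (hyperparameters +
# evaluation metrics) so runs can be compared later, instead of encoding everything
# into notebook names. Paths and values are illustrative only.
import json, hashlib, time
from pathlib import Path

def log_run(params: dict, metrics: dict, registry: str = "model_runs.jsonl") -> str:
    """Append one tuning run to a simple JSON-lines run log and return its id."""
    run_id = hashlib.sha1(json.dumps(params, sort_keys=True).encode()).hexdigest()[:10]
    record = {"run_id": run_id, "timestamp": time.time(), "params": params, "metrics": metrics}
    with Path(registry).open("a") as f:
        f.write(json.dumps(record) + "\n")
    return run_id

# Example: two tuning sessions, later sortable by whichever metric matters (F1 here).
log_run({"max_depth": 6, "learning_rate": 0.1},  {"f1": 0.81, "recall": 0.77})
log_run({"max_depth": 8, "learning_rate": 0.05}, {"f1": 0.84, "recall": 0.80})

runs = [json.loads(line) for line in Path("model_runs.jsonl").read_text().splitlines()]
best = max(runs, key=lambda r: r["metrics"]["f1"])
print(best["run_id"], best["params"])
```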
This doesn’t even take into account what happens once you pick the best-performing model: actually implementing version control for the model that has been operationalized in production, and the subsequent code changes required to consume it via some business app/process. How does the typical end user interact with the operationalized scoring system once it is introduced via the app? How will it scale?! All of this involves confidence testing, checking against a set threshold, and triggering some type of closed-loop action when anomalies are detected. Plus, how do you get sign-off from the different parties, and orchestrate between the different cloud and on-premises servers that support the business process (with all the corporate firewall / networking / data movement / storage / encryption requirements and rules)? Maybe you have others to think about this for you – but if you want to be a data scientist worth the overused moniker, you should care enough to learn about DevOps and how you can be a better corporate citizen, not just the Rockstar Data Scientist who alienates everyone on the way to a root cause. IMHO:
This should be part of your Data Science process. Period. Hard stop. Not only for you, but for those who come after you or are on your team. No need to reinvent the wheel. Plus, for organizations that have strict CI/CD / DevOps procedures and limited Ops staff, the automation you bring with your project deliverables will win you favor, at a minimum, for considering this aspect that is vital to every other AppDev-type project/role in your company.
Integrating Databricks with Azure DW, Cosmos DB & Azure SQL (part 1 of 2)
I tweeted a data flow earlier today that walks through an end-to-end ML scenario using the new Databricks on Azure service (currently in preview). It also includes the orchestration pattern for ETL (populating tables, transforming data, loading into Azure DW, etc.), as well as the SparkML model creation, with the recommendations output stored in Cosmos DB. Here is a refresher:
Some nuances that are really helpful to understand: reading data in as CSV but writing results as parquet. This parquet file is then the input for populating a SQL DB table as well as the normalized DIM table in SQL DW, both by the same name (a minimal PySpark sketch of this step follows the list of nuances below).
Selecting the latest Databricks on Azure version (4.0 version as of 2/10/18).
Using #ADLS (Data Lake Storage, my preference) and/or Blob storage.
Azure #ADFv2 (Data Factory v2) makes it incredibly easy to orchestrate data movement from 3rd-party clouds like S3, or from on-premises data sources in a hybrid scenario, to Azure, with the scheduling / tumbling-window triggers one needs for effective data pipelines in the cloud.
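Here is the minimal PySpark sketch promised above for the CSV-in / parquet-out step – the storage paths, column handling, and lake layout are hypothetical placeholders, not the actual pipeline’s:

```python
# A minimal, hypothetical PySpark sketch of the CSV-to-parquet nuance above.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("csv-to-parquet").getOrCreate()

# Read the raw CSV from ADLS/Blob with header + schema inference.
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("adl://mydatalake.azuredatalakestore.net/raw/ratings.csv"))

# Write the cleansed result as parquet; this file then feeds both the SQL DB table
# and the DIM table in SQL DW of the same name.
(raw.dropDuplicates()
    .write
    .mode("overwrite")
    .parquet("adl://mydatalake.azuredatalakestore.net/curated/ratings.parquet"))
```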
I love how easy it is to connect BI tools as well. Power BI Desktop can connect to any ODBC data source, and specifically to your Databricks clusters by using the Databricks ODBC driver. The Power BI service is a fully managed web application running in Azure. As of November 2017, it only supports Spark running on HDInsight; however, you can create a report using Power BI Desktop and publish it to the service.
The next post will cover using @databricks on @Azure with #EventHubs!
Week 3 – @NFLFantasy PPR Play/Bench Using #MachineLearning
Recap from Week 3 (sorry – I really am trying to post before Thursday night but it seems that between work right now and updating my model stats mid-week, I just run out of time).
Week 3 was wildly successful. NFL.com was closer this time in terms of predicting my win over my opponent, but nowhere near the results that I achieved. I will always stand by Russell Wilson – what kind of Seahawk would I be if I threw in the towel? – and in my 2nd league (Standard format), he did not fail! He was simply divine. But alas, he is not my primary-league QB (Tom Brady is – a hard pill to swallow personally, being a die-hard Seahawks fan after what happened in a certain very important yesteryear game – but he has proven his PPR fantasy value in Week 3). Primary League Week 3 – Wins = 3 / Losses = 0 (remember, after draft day, I was projected to end the season with an 8-8 W/L ratio. So, this might be the week; maybe not).
But last week, I genuinely felt bad – Locheness Jabberwokies, my Week 3 opponent, happens to also be my man. And this annihilation just felt like a win that went one step over the line of fairness. I mean, a win’s a win – but this kind of decimation belongs outside of one’s relationship. Trust me. But he was a good sport. Except he will no longer listen to my neurotic banter about losing in any given week, even if all signs point to a loss. Somehow, when I trust my model, it all works out. Now, I can’t predict injuries mid-game, like what happened in Week 4 to Ty Montgomery (my League 3 Flex-position player). Standard-league-wise, he brought home 2.3 points vs. a projection of about 10.70 Standard points with a standard deviation of +/- 1.5. But this was my lineup for Week 3 across my 3 leagues:
League #1 (Primary PPR) – remember, I aim to not just win but also optimize my lineup.
A bench full of points is a fail to me. But in this case, I benched Jordan Reed and picked up whoever was the next available TE off the waiver wire (granted, he definitely contributed nothing). But out of my WR1 and WR2 + WR Flex, those I played were the best options (even though Mike Evans came in about 1.10 points below Adam Thielen on the bench, that gap was within the expected standard deviation, so either one would have been fine to play).
My RB situation has always been the bane of my league this year, starting with my draft choices – nothing to write home about except seeing the early value of Kareem Hunt (TG), even when NFL.com continued to project very little for him.
Terrance West was supposed to hit double digits, but my model said to bench him vs. either Mike Gillislee or Kerwynn Williams. Both scored very little and were essentially within each other’s standard deviation, negating their slight point difference (a tiny sketch of that play/bench tie-break rule is below).
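For the curious, here is a tiny sketch of that play/bench tie-break rule – the projections and standard deviations are made-up illustrations, not my model’s real output:

```python
# A tiny, hypothetical sketch of the tie-break above: if two players' projections
# differ by less than their combined uncertainty, treat the choice as a wash.
from dataclasses import dataclass

@dataclass
class Projection:
    player: str
    points: float   # projected fantasy points
    std: float      # projection standard deviation

def play_or_bench(a: Projection, b: Projection) -> str:
    gap = abs(a.points - b.points)
    if gap <= a.std + b.std:
        return f"Toss-up: {a.player} vs {b.player} (gap {gap:.1f} within combined std)"
    starter = a if a.points > b.points else b
    return f"Start {starter.player}"

print(play_or_bench(Projection("Gillislee", 7.2, 2.0), Projection("K. Williams", 6.8, 1.8)))
```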
All in all, the players I played worked out well, and yes, many stellar performances carried those that failed – and those failures might be outliers in some regard (or at least they won’t bring home that many points week over week). But the PPR space is my golden circle of happiness – after all, I built my original algorithm using PPR play/bench decisions + historical point spreads + my secret sauce nearly 5 years ago; and those years of learning have “taught” the model (and me) many nuances otherwise missed by others in the sports ML space (though I greatly respect what my fellow ML “sportstaticians” put forth, my approach is very different from what I glean from others’ work).
One day, I would love to have a league with only ML Sports folks; the great battle of the algorithmic approaches – if you are interested, let me know in the comments.
League 2 (Standard): Wins = 3/Losses = 0:
As you can see, I should have played DeSean Jackson over Adam Thielen or my Flex-position player Ty Montgomery. And geez, I totally spaced on pulling Jordan Reed like I did in League 1. This win was largely because of Russell Wilson, as mentioned before, plus Devonta Freeman and the waiver-wire defense pickup of the Bengals, whom I’m glad I grabbed in time for the game. Oh yeah, I am not sure why Cairo Santos shows as BYE but earned me 6 points??? NFL.com had some weird stuff happening around 12:30 last Sunday: games showed as in play even though kickoff wasn’t for another 30 minutes, and those erroneously shown as in play still allowed players to be added from the wire as though the games hadn’t kicked off. Anyway, not as proud, but still another win – Year 1 for Standard; perhaps after another 5 years training Standard like my PPR league, I will have more predictable outcomes, other than luck.

#NFL.com, #Week3, #Standard, @NFLFantasy
Eye Tracking & Applied ML: Soapbox Validations
Anyone who has read my blog (shameless self-plug: http://www.lauraedell.com) over the past 12 years will know I am very passionate about drinking my own analytical Kool-Aid. Whether during my stints as a Programmer, BI Developer, BI Manager, Practice Lead / Consultant, or Senior Data Scientist, I believe wholeheartedly in measuring my own success with advanced analytics. Even my fantasy football success (more on that in a later post) can be attributed to advanced Machine Learning… but you wouldn’t believe how often this type of measurement gets ignored.
Introducing you, dear reader, to my friend “Eye-Tracker” (ET). Daunting little set of machines in that image, right?! But ET is a bonafide bada$$ in the world of measurement systems; oh yeah, and ET isn’t a new tech trend – in fact, mainstream ET systems are a staple of any PR, marketing, or web designer’s tool arsenal, a yardstick for measuring program efficacy between users’ intended behavior and actual outcomes/actions.
In my early 20s, I had my own ET experience and have been a passionate advocate since, having witnessed what happens when you compound user inexperience with poorly designed search / e-commerce sites. I was lucky enough to work for the now-uber online travel company who shall go nameless (okay, here is a hint: remember a little ditty that ended with some hillbilly singing “dot commmm” and you will know to whom I refer). This company believed so wholeheartedly in the user experience that they allowed me, young ingénue of the workplace, to spend thousands on eye tracking studies against a series of balanced scorecards that I was developing for the senior leadership team. This is important because you can ASK someone whether a designed visualization is WHAT THEY WERE THINKING or WANTING, even when it is built iteratively with the requestor, and still not get a straight answer. Why, you ponder, would this be necessary when I could just ask/survey my customers about their online experiences with my company and save beaucoup $$?
Well, here’s why: 9 times out of 10, survey participants, not wanting to offend, will nod ‘yes’ instead of being honest – conflict avoidance at its best. Note, this applies to most, but I can think of a few in my new role who are probably reading this and shaking their heads in disagreement at this very moment.
Eye tracking studies are used to measure efficacy by tracking which content areas engage users’ brains vs. areas that fall flat, are lackluster, overdesigned, and/or contribute to eye/brain fatigue. They do this by “tracking” where, and for how long, your eyes dwell on a quadrant (aka a visual / website content area / widget on a dashboard) and by recording the path and movement of the eyes between different quadrants on a page. It’s amazing to watch these advanced, algorithmically tuned systems pick up even the smallest flick of one’s eyes, whether darting to or away from the above-the-fold content, in near real time. The intended audience being measured generates the validation statistics necessary to evaluate how well your model fit the data. In the real world, attaboys or “ya done a good job” high fives should be doled out only after validating efficacy: e.g., if customers’ dwell time increases, you can distinguish randomness from intended action; otherwise, go back to the proverbial drawing board until you earn that ‘attaboy’ outright. A minimal aggregation sketch follows.
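As a rough illustration of what that aggregation looks like in practice (not any vendor’s actual output format), here is a minimal sketch that rolls a hypothetical fixation log up into dwell time per quadrant:

```python
# A minimal, hypothetical sketch of turning raw gaze fixations into dwell time per
# quadrant/content area. The fixation log and quadrant names are made up for illustration.
import pandas as pd

fixations = pd.DataFrame({
    "quadrant":    ["top-left", "top-left", "top-right", "bottom-right", "top-left"],
    "duration_ms": [420, 310, 180, 90, 260],   # how long each fixation lasted
})

dwell = (fixations.groupby("quadrant")["duration_ms"]
         .agg(total_dwell_ms="sum", fixation_count="count")
         .sort_values("total_dwell_ms", ascending=False))
print(dwell)   # e.g., top-left dominates – the "red hot" region of the heatmap
```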
What I also learned, which seems a no-brainer now: people read from left-upper to right-bottom (LURB). So when I see anything that doesn’t at LEAST follow those two simple principles, I just shake my head and tsk, tsk, tsk, wondering whether human evolution is shifting with our digital transformation journey, or whether the visualization is destined to be bucketed with the “that’s interesting to view once” crowd instead of rising to the level of usefulness it was designed for.
Come on now, how hard is it to remember to stick the most important info in that top-left quadrant and the least important in the bottom right, especially when creating visualizations for use in the corporate workplace by senior execs? They have even less time and attention these days to focus on even the most relevant KPIs – the ones they need to monitor to run their business and will be asked to update the CEO on each quarter – with all those fun distractions that come with the latest vernacular du jour taking up their brain space: “give me MACHINE LEARNING or give me death,” the upstart that replaced mobile/cloud/big data/business intelligence (you fill in the blank).
But for so long, it was me against the hard reality that no one knew what I was blabbing on about, nor would they give me carte blanche to re-run those studies ever again. And lo and behold, my Laura-ism soapbox has now been vetted – quantified, in fact – by a prestigious university professor from Carnegie, all made possible because a little-known hero named Edmund Huey, now near and dear to my heart and grandfather of the heatmap, followed up his color-friendly block chart by building the first device capable of tracking eye movements while people were reading. This breakthrough initiated a revolution for scientists, but it was intrusive: readers had to wear special lenses with a tiny opening and a pointer attached, like the 1st image pictured above.
Fast forward 100 years… combine all the ingredients into the cauldron of innovation and technological advancement, sprinkle with my favorite algorithmic pals, CNN & LSTM, and voila! You have just baked yourself a popular visualization known as a heat/tree map (with identifiable info redacted):
This common visual is akin to eye tracking analytics which you will see exemplified in the last example below. Cool history lesson, right?
Even cooler is this example from a travel website, Travel Tripper, which published Google eye-tracking results specific to the hotel industry. Instead of the treemap you might be used to (akin to an out-of-the-box Tableau or other BI tool visualization), you get the same coordinates laid out over search results. Imagine having your website underneath: instead of guessing what content should sit above or below the fold, in the top left or right of the page, you can use these tried-and-true eye tracking methods to quantify exactly which content items customers or users are attracted to 1st and where their eyes “dwell” the longest on the page (red hot).
So, for those non-believers, I say: become a web analytics trendsetter, driving the future of machine design forward (à la “Web Analytics 3.0”).
Be a future-thinker, forward mover, innovator of your data science sphere of influence, always curious yet informed to make intelligent choices.
Wonderful World of Sports: Hey NFL, Got RFID?
As requested by some of my LinkedIn followers, here is the NFL Infographic about RFID tags I shared a while back:
I hope @NFL @XboxOne #rfid data becomes more easily accessible. I have been tweeting about the Zebra deal for 6 months now, and the awesome implications this would have on everything from sports betting to fantasy enthusiasts to coaching, drafting and what have you. Similarly, I have built a fantasy football (PPR) league bench/play #MachineLearning model using #PySpark which, as it turns out, is pretty good. But it could be great with the RFID stream.
This is where the #IoT rubber really hits the road, because there are so many more fans of the NFL than there are folks who really grok the “Connected Home” (not knocking it, but it doesn’t have the reach tentacles of the NFL). Imagine measuring the burn rate vs. performance degradation of these athletes mid-game and, one day, being able to stream that to the field or the booth for in-game course corrections. Aah, a girl can only dream…
Is Machine Learning the New EPM Black?
I am currently a data scientist and am also a certified Lean Six Sigma black belt. I specialize in the Big Data Finance, EPM, BI, and process improvement fields, where this convergence of skills has given me the ability to understand the interactions between people, process, and technology/tools.
I would like to address the need to transform traditional EPM processes by leveraging more machine learning to help reduce forecast error and eliminate unnecessary budgeting and planning rework and cycle time, using a three-step ML approach (a minimal code sketch follows the steps below):
1st, determine which business drivers are statistically meaningful to the forecast (correlation), eliminating those that are not.
2nd, cluster those correlated drivers by significance to determine those that cause the most variability to the forecast (causation).
3rd, use the outputs of steps 1 and 2 as inputs to the forecast, and apply ML in order to generate a statistically accurate, forward-looking forecast.
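Here is the minimal sketch promised above of the three steps – the driver data is synthetic, and the correlation threshold, cluster count, and model choice are illustrative assumptions only:

```python
# A minimal, hypothetical sketch of the three steps: correlation screening,
# clustering the surviving drivers, and fitting an ML forecast on them.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
drivers = pd.DataFrame(rng.normal(size=(120, 8)),
                       columns=[f"driver_{i}" for i in range(8)])
revenue = 3 * drivers["driver_0"] - 2 * drivers["driver_3"] + rng.normal(0.5, 1, 120)

# Step 1: keep only drivers with a meaningful correlation to the forecast target.
corr = drivers.corrwith(revenue).abs()
keep = corr[corr > 0.3].index.tolist()

# Step 2: cluster the surviving drivers to group the ones contributing the most variability.
profiles = drivers[keep].T
clusters = KMeans(n_clusters=min(2, len(keep)), n_init=10, random_state=0).fit_predict(profiles)

# Step 3: feed the screened drivers into an ML model for the forward-looking forecast.
model = GradientBoostingRegressor(random_state=0).fit(drivers[keep][:100], revenue[:100])
print("kept drivers:", keep, "clusters:", dict(zip(keep, clusters)))
print("holdout R^2:", round(model.score(drivers[keep][100:], revenue[100:]), 3))
```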
Objection handling, in my experience, focuses on cost, time, and the sensitive change-management aspect. Here is how I have handled each, for example:
- Cost: all of these models can be built using free tools like R and the Python data science libraries, so there is minimal to no technology/tool CapEx/OpEx investment.
- Time: most college grads with a business, science, or computer engineering degree will have worked with R and/or Python (and more) while earning their degree, which reduces the ramp time needed to get folks acclimated and up to speed. To fill the remaining skill-set gap, they can use the vast libraries of work already provided by the R / Python initiatives, or the many other data science communities available online for free, as a starting point – which also minimizes the time lost to unnecessary cycles and rework spent trying to define drivers based on gut feel alone.
- Change: this is the bigger objection, and it has to be handled according to the business culture and its openness to change. The best means of handling it is to simply show them. The proof is in the proverbial pudding, so creating a variance analysis of the ML forecast, the human forecast, and the actuals will speak volumes – and bonus points if the correlation and clustering analysis also surfaces previously unknown nuggets of information richness.
Even without finding that golden-nugget ticket, the CFO will certainly take notice of a more accurate forecast and appreciate the time and frustration saved by a less consuming budgeting and planning cycle.
KPIs in Retail & Store Analytics
I like this post. While I added some KPIs to their list, I think it is a good list to get retailers on the right path…
KPIs in Retail and Store Analytics (a continuation of a post made by Abhinav on kpisrus.wordpress.com):
A) If it is a classic brick-and-mortar retailer:
Retail / Merchandising KPIs:
-Average Time on Shelf
-Item Staleness
-Shrinkage % (includes things like spoilage, shoplifting/theft and damaged merchandise)
Marketing KPIs:
-Coupon Breakage and Efficacy (which coupons drive desired purchase behavior vs. detract)
-Net Promoter Score (“How likely are you to recommend xx company to a friend or family member?” – this is typically measured during customer satisfaction surveys and, depending on your organization, may fall under the Customer Ops or Marketing department in terms of responsibility; a short calculation sketch for this and a couple of the e-retailer KPIs follows the full list below).
-Number of trips (in person) vs. e-commerce site visits per month (tells you if your website is more effective than your physical store at generating shopping interest)
B) If it is an e-retailer:
Marketing KPIs:
-Shopping Cart Abandonment %
-Page with the Highest Abandonment
-Dwell time per page (indicates interest)
-Clickstream path for purchasers (as Jamie mentioned, do they arrive via email, a promotion, or a flash-sale source like Groupon, and if so, what clickstream paths do they take? This should look like an upside-down funnel: the visitors / unique users who enter your site at the top, then the various paths (pages) they view en route to a purchase).
-Clickstream path for visitors (take Expedia for example…Many people use them as a travel search engine but then jump off the site to buy directly from the travel vendor – understanding this behavior can help you monetize the value of the content you provide as an alternate source of revenue).
-Visit to Buy %
-If direct email marketing is part of your strategy, analyzing click rate is a close second to measuring conversion rate. They are 2 different KPIs – one the king, the other the queen – and both are necessary to understand how effective your email campaign was and whether it warranted the associated campaign cost.
Site Operations KPIs / Marketing KPIs:
-Error % Overall
-Error % by Page (this is highly correlated to the Pages that have the Highest Abandonment, which means you can fix something like the reason for the error, and have a direct path to measure the success of the change).
Financial KPIs:
-Average order size per transaction
-Average sales per transaction
-Average number of items per transaction
-Average profit per transaction
-Return on capital invested
-Margin %
-Markup %
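As promised above, here is a short, hypothetical calculation sketch for a few of these KPIs (NPS, shopping cart abandonment %, and average order size) – all of the numbers are made up for illustration:

```python
# A short, hypothetical sketch of three of the KPIs above; inputs are illustrative only.
def net_promoter_score(survey_scores):
    """NPS = % promoters (9-10) minus % detractors (0-6), on a -100..100 scale."""
    promoters = sum(s >= 9 for s in survey_scores)
    detractors = sum(s <= 6 for s in survey_scores)
    return 100 * (promoters - detractors) / len(survey_scores)

def cart_abandonment_pct(carts_created, orders_completed):
    return 100 * (carts_created - orders_completed) / carts_created

def average_order_size(total_revenue, orders_completed):
    return total_revenue / orders_completed

print(net_promoter_score([10, 9, 8, 7, 6, 3, 10]))   # ~14.3
print(cart_abandonment_pct(1200, 400))               # 66.7% of carts abandoned
print(average_order_size(52_000.00, 400))            # $130 per order
```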
I hope this helps. Let me know if you have any questions.
You can reach me at mailto:lauraedell@me.com, or you can visit my blog, where I have many posts listing out various KPIs by industry and how to best aggregate them for reporting and executive presentation purposes ( http://www.lauraedell.com ).
It was very likely that I would write on KPIs in Retail or Store Analytics after my last post on Marketing and Customer Analytics. The main motive behind retailers looking into BI is the ‘customer’: how they can quickly react to changes in customer demand (or rather, predict customer demand), remove wasteful spending through targeted marketing, exceed customer expectations, and hence improve customer retention.
I did some quick research on what companies have been using as measures of performance in the retail industry and compiled a list of KPIs that I would recommend for consideration.
Customer Analytics
The customer being the key for this industry, it is important to segment customers, especially for strategic campaigns, and to develop relationships for maximum customer retention. Understanding customer requirements and dealing with ever-changing market conditions are key for a retailer to survive the competition.
- Average order size per transaction
- Average sales per transaction
How Do You Use LinkedIn? (Social Media Infographics)
How often do you refresh your LinkedIn profile pic? Or worse, the content within your profile? Unless you are a sales exec trolling the social networking site, or a job seeker, I would surmise not that often; in fact, ‘rarely’ is the most apropos description. Thoughts…? (Yes, she’s back (again), but this time for good, dear readers… @Laura_E_Edell (#infographics) says thanks to designinfographics.com for her latest content postings!)
And just because I call it out doesn’t mean you will know the best approach to updating your LinkedIn profile. And guess what… there’s an infographic for that! (http://www.linkedin.com/in/lauraerinedell)
MicroStrategy World 2012 – Miami
Our internal SKO (sales kick-off) meeting was the beginning of this year’s MSTR World conference (held in Miami, FL at the InterContinental Hotel on Chopin Plaza). As with every year, the kickoff meeting is the preliminary gathering of the sales force in an effort to “rah-rah” the troops who work the front lines around the world (myself included).
What I find most intriguing is the fact that MicroStrategy is materializing all of those BI pipe dreams we ALL have. You know the ones I mean: I didn’t buy socialintelligence.co for my health several years ago. It was because I saw the vision of a future where business intelligence and social networking were married. Or take cloud intelligence, aka BI in the cloud. Looking back to 2008, I remember my soapbox discussion of BI mashups, à la My Google, supported in a drag-and-drop, off-premises environment. And everyone hollered that I was too visionary, too far ahead; that everyone wanted reporting and, if I was lucky, maybe even dashboards.
But the acceleration continued, whether adoption grew or not.
Then, I pushed the envelope again: I wanted to take my previous thought of the mashup and morph it into an app integrated with BI tools. Write-back to transactional systems or web services was key.
What is a dashboard without the ability to take action within the same interface? Everyone talks about actionable metrics/KPIs. Well, I will tell you that a KPI, BY DEFINITION, is actionable.
But making your end users go to a separate ERP or CRM to make the changes necessary to affect a KPI will drive your users away. What benefit can you offer them in that instance? Going to a dashboard or an Excel sheet is no different. It is one application to view and, if they are lucky, to analyze their data. If they were using Excel before, they will still be using Excel, especially if your dashboard isn’t useful to day-to-day operations.
Why? They still have to go to a 2nd application to take action.
Instead, integrate them into one.
Your dashboard will become meaningful and useful to the larger audience of users.
Pipe dream, right?
NO. I have proved this out many times now and it works.
Back in 2007-2008, it was merely a theory I pontificated with you, my dear readers.
Since then, I have proved it out several times over and proven the success that can be achieved by taking that next step with your BI platforms.
Folks, if you haven’t done it, do it. Don’t waste any more time. It took me less than 3 days to write the web services code to consume the Salesforce APIs, including Chatter (business “Twitter,” according to SFDC), into my BI dashboard (a mobile dashboard, in fact).
And suddenly, a sales dashboard becomes relevant. No longer does the sales team have to view their opportunities and quota attainment in one place, only to leave or open a new browser to access their salesforce.com portal in order to update where they stand mid-quarter.
But wait – now they have forgotten which KPIs they needed to add comments to, because the dashboard where those KPIs showed red is now closed, and their sales GM is screaming at them on the phone. Oh wait… they are on the road while this is happening, their iPad data plan has expired, and no wireless connection can be found.
What do you do?
Integrating salesforce.com into their dashboard eliminates at least one step (opening a new browser) in the process. Offering mobile offline transactions is a new feature of MicroStrategy’s mobile application. This allows those sales folks to make the comments they need to make while offline, on the road, and those comments will be queued until they are online again.
One stop, one dashboard to access and take action through, even when offline, using their mobile (Android, iPad/iPhone, or BlackBerry) device.
This is why I’m excited to see MicroStrategy pushing the envelope on mobile BI futures.
MicroStrategy Personal Cloud – a Great **FREE** Cloud-based, Mobile Visualization Tool
Have you ever needed to create a prototype of a larger Business Intelligence project focused on data visualizations? Chances are, you have, fellow BI practitioners. Here’s the scenario for you day-dreamers out there:
OK, and now we are back… rewind to sentence 1 –
Prototyping is to dashboard design, or any data visualization design, as pencils and grid paper are to me. Mano a mano – I mean, totally symbiotic, right?
But wireframing is torturous when you are in a consultative or pre-sales role, because you can’t present napkin designs to a client, or pictures of a whiteboard, unless you are showing them the process behind the design. (By the way, this is an effective “presentation builder” when you are going for dramatic effect –> à la “first there were cavemen, for whom chisel and stone were all one had to create metrics –> then the whiteboard –> then the… wait!)
This is where said BI practitioner needs something MORE – that dramatic pop, that whiz-BAM – to give to prospective clients/customers in their leave-behind presentation.
And finally, the girl gets to her point (you are always so patient, my loving blog readers)… While I am biased, if you forget whom I work for and just take the tool on its own merits, you will see the awesomeness that the new MicroStrategy Personal Cloud offers for (drum roll please) PROTOTYPING a new dashboard – or just building, distributing, and mobilizing your spreadsheet of data in a highly stylized, graphical way that tells a story far better than a spreadsheet can in most situations. (Yes, naysayers, I know that in the 5% of circumstances you can name, a spreadsheet is more apropos, but HA HA, I say: this personal cloud product can include the data table right alongside the data visualizations!)
Best of all, it is free.
I demoed this recently and timed how long it took to upload a spreadsheet, render 3 different data visualizations, generate the link to send to mobile devices (iPads and iPhones), allow for the network latency for said demo-ees to receive the email with the link, and have them launch the dashboard I created – and guess what the total time was?
It took only 23.7 minutes from concept to mobilization!
Mind you, I was also using data from the prospect that I had never seen or had any experience with.
OK, here is how it was done:
1) Create a FREE account or log in to your existing MicroStrategy account (by existing, I mean that if you have ever signed up for the MicroStrategy forums or discussion boards, or you are an employee, you can use the same login) at https://www.microstrategy.com/cloud/personal
2) Click the button to Create New Dashboard:
- Now, you either need to have a spreadsheet of data, OR you can choose one of the sample spreadsheets that MicroStrategy provides (which is helpful if you want to see how others set up their data in Excel, or how others have used Cloud Personal to create dashboards; even though it is sample data, it is actually REAL data that has been scrub-a-dub-dubbed for your pleasure!). If using a sample data set, I recommend the FAA data. It is real air-traffic data, with carrier, airport code, days of the week, etc., which you can use to plan your travel by; I do… see the screenshot below. There are some airports, and some carriers flying into said airports, that I WILL NOT fly on certain days of the week on which I must travel. If there is a choice, I will choose alternate carriers/routes. This FAA data set will enable you to analyze this information and make the most informed decision (outside of price) when planning your travel. Trust me… VERY HELPFUL! Plus, you can look at all the poor slobs without names sitting at the Alaska Air gate who DIDN’T use this information to plan their travel, and as you casually saunter to your own gate on that Tuesday between 3 and 6 PM at SeaTac airport, you will remember that they look so sad because their Alaska Air flight has an 88% likelihood of being delayed or cancelled. (BTW, before you jump on me for my not-so-nice reference to said passengers, it is merely a quotation from my favorite movie, ‘Breakfast at Tiffany’s’… says Holly Golightly: “Poor cat… poor old slob without a name.”)
If using your own data, select the spreadsheet you want to upload.
3) Preview your data; IMPORTANT STEP: make sure you change any fields to their correct type (either Attribute, Metric, or Do Not Import).
Keep in mind the 80/20 rule: 80% of the time, MicroStrategy will designate your data as either an Attribute or a Metric correctly, using a simple rule of thumb: text (or VarChar/NVarChar if using SQL Server) will always be designated as an Attribute (i.e., your descriptor/dimension), and your numerals will be designated as your Metrics. BUT, if your spreadsheet uses ID fields, like Store ID or Case ID, along with descriptors like Store DESC or Case DESC, most likely MicroStrategy will assume the Store ID / Case ID are Metrics (since those fields are numeric in the source). This is an easy change! You just need to make that change ahead of time using the drop-down indicator arrows in the column headings – to find them, hover over the column names with your mouse until you see the drop-down indicator arrow. Click on the arrow to change an Attribute column to a Metric column and vice versa (see screenshot):
Once you finish previewing your data and everything looks good, click OK at the bottom right of your screen.
In about 30-35 seconds, MicroStrategy will have imported your data into the Cloud for you to start building your awesome dashboards.
4) Choose a visualization from the menu that pops up on your screen upon successfully importing your spreadsheet:
Here is the 2010 NFL data which I uploaded this morning. It is a heatmap showing the home teams as well as any teams they played in the 2010 season. The size of the box reflects how big the win or loss was. The color indicates whether they won or lost (green = home team won // red = home team lost).
For all of you, dear readers, I bid you a Happy New Year. May your ideas flow aplenty, and your data match your dreams (of what it should be) :). Go fearlessly into the new world order of business intelligence, and know that I, Laura E., your Dashboard Design Diva, called Social Intelligence the New Order in 2005, again in 2006, and in 2007. 🙂 Cheers, y’all.
Business Intelligence Clouds – The Sky’s the Limit
I am back…(for now, or so it seems these days) – I promise to get back to one post a month if not more.
Yes, I am known for my frequent use of puns, bordering on the line between cheesy and relevant. Forgive the title. It has been over 110 days since I last posted, which for me is a travesty. Despite my ever-growing list of activities, both professional and personal, I have always put my blog in the top-priority quadrant.
Enough ranting…I diverged; and now I am back.
Ok, cloud computing (BI-tools related) seems to be all the rage, right up there with Mobile BI, big data, and social. I dare use my own term, coined back in 2007, ‘Social Intelligence,’ as now others have trademarked the phrase (but we, dear readers, know the truth –> we have been thinking about the marriage between social network / social media data sets and business intelligence for years now)… Alas, I diverge again. Today, I have been thinking a lot about cloud computing and Business Intelligence.
Think about BI and portals, like SharePoint (just to name one)… It was all the rage (or perhaps still is): “Integrate my BI reporting with my intranet / portal / SharePoint web parts.” OK, once that was completed successfully, did it buy much in terms of adoption, or savings, or any of those ROI catchphrases – “Buy our product, and your employees will literally save so much time they will be basket-weaving their reports into TRUE analysis”? What they didn’t tell you was that more bandwidth meant less need for those people, which in turn meant people went into scarcity mode/tactics trying to make themselves seem, or be, relevant… And I don’t fault them for this… Companies were not ready for, or did not want to think about, what they were going to do with the newly freed-up resources they would have when the panacea of BI deployments actually came to fruition… And so the wheel turned. What came next? Reports became dashboards; dashboards became scorecards (each complementing the former); scorecards introduced proactive notification/alerting; alerting introduced threshold-based notification across multiple devices/methods, one of which was mobile; mobile notification brought the need for mobile BI –> and frankly, I will say it: Apple brought us the hardware to see the latter into fruition… Swipe, tap, double-tap –> drill-down was now fun. Mobile made portals seem like child’s play. But what about when you need to visualize something and ONLY have it in a spreadsheet?
(I love hearing this one; as if the multi-billion-dollar company whose employee is claiming to only have the data on a spreadsheet didn’t get it from somewhere else; I know, I know –> in the odd case, yes, this is true… so I will play along)…
The “only on a spreadsheet” crowd made mobile seem restrictive; enter RoamBI and the likes of MicroStrategy (yes, MicroStrategy now has a data-import feature for spreadsheets with advanced visualizations for both web and mobile)… Enter QlikView for the web crowd, and the “I’m going to build a dashboard in less than 30 minutes” sales force: “Wait… that’s not all, folks… come now (to the meeting room) with your spreadsheet, and watch our magicians create dashboards you can take with you from the meeting.”
But no one cared about maintenance, data integrity, cleanliness, or accuracy… I know… they are meant to be nimble, and I see their value in some instances and some circumstances… Just like the multi-billion-dollar company that only tracks data on spreadsheets… I get it; there are some circumstances where they exist… But it is not the norm.
So, here we are… mobile offerings here and there; build a dashboard on the fly; import spreadsheets during meetings; but what happens when you go back to your desk and still have to open up your portal, and now have a new dashboard that only you can see unless you forward it out manually?
Enter cloud computing for BI – but not at the macro scale; let’s talk personal… Personal clouds: individual sandboxes of a predefined amount of space, which IT has no sanction over other than to bless how much space is allocated… From there, what you do with it is up to you. Hackles going up, I see… How about this…
Salesforce.com –> the biggest CRM cloud today. And for many years now, SFDC has embraced cloud computing. And big data, for that matter; and databases in the cloud (database.com, in fact)… Lions and tigers and bears, oh my!
So isn’t it natural for BI to follow CRM into cloud computing? OK, OK… for those of you whose hackles are still up, some rules (you IT folks will want to read further):
Rules of the game:
1) Set an amount of space (not to be exceeded, no matter what) – but be fair and realistic; 100 MB is useless. In today’s world, a 4 GB flash drive was advertised for $4.99 during the back-to-school sales, so I think you can pony up enough to make the cloud useful.
2) If you delete it, there is a recycle bin (like on your PC/Mac); if you permanently delete it, too bad / so sad… We need to draw the line somewhere. Poor SharePoint admins around the world are having to drop into STSADM commands to restore Alvin Analyst’s Most Important Analysis that he not only moved to the recycle bin but then permanently deleted.
3) Put some things of use in this personal cloud at work, like BI tools; upload a spreadsheet and build a dashboard in minutes with visualizations like the graph matrix (a crowd pleaser) or a time-series slider (another crowd favorite; people just love time-based data 🙂 ). But I digress (again)…
4) Set up BI reporting on the logged events: understand how many users are using your cloud environment, how many are getting errors, and what errors they are getting and why. This simple type of event-based logging is very informative. (We BI professionals tend to overthink things, especially those who are also physicists.)
5) Take a look at what people are using the cloud for; if you create and add meaningful tools like BI visualizations and data import and offer viewing via mobile devices like iPhone/iPad and Android or web, people will use it…
This isn’t a corporate iTunes or MobileMe cloud; this isn’t Amazon’s elastic cloud (EC2). This is a cloud with the sole purpose of supporting BI – wait, not just supporting, but propelling users out of the doldrums of the current state of affairs and into the future.
It’s tangible and just cool enough to tell your colleagues and work friends “hey, I’ve got a BI cloud; do you?”
BIPlayBook.Com is Now Available!
As an aside, I’m excited to announce my latest website: http://www.biplaybook.com is finally published. Essentially, I decided that you, dear readers, were ready for the next step. What comes next, you ask?
After Measuring BI data –> Making Measurements Meaningful –> and –> Massaging Meaningful Data into Metrics, what comes next is to discuss the age-old question of ‘So What?’ & ‘What Do I Do About It?’
BI PlayBook offers readers the next level of real-world scenarios now that BI has become the nomenclature of yesteryear & is used by most to inform decisions. Basically, it is the same, with the added bonus of how to tie BI back into the original business process, customer service/satisfaction process or really any process of substance within a company.
This is quite meaningful to me because so often, as consumers of goods and services, we find our voices go unheard, especially when we are left dissatisfied. Can you muster the courage to voice your issue (dare I say, ‘complain’?) using the only tools provided: poor website feedback forms, surveys, or (gasp) relaying your issue by calling into a call center or IVR system (double gasp)? I don’t know if I can…
How many times do we get caught in the endless loop of an IVR, only to be ‘opted out’ (aka hung up on) when we do not press the magical combination of numbers on our keypads to reach a live human being? Or, when we are sneaky and press ‘0’, only to find out the company is one step ahead of us, having programmed ‘0’ to automatically transfer our call to our friend ‘ReLisa Boutton’ – aka the Release Button?
Feedback is critical, especially as our world has become consumed by social networks. The ‘chatter’ of customers that ensues – choosing to ‘Like’ or join your company page or product, or tweeting about the merits or demerits of your value proposition – is not only rich material if you care about understanding your customer; it is also a key indicator of how well you are doing in the eyes of your customer. Think about how many customer satisfaction surveys you have taken that ask whether or not you would recommend the company to a friend or family member.
This measure defines one’s NPS, or Net Promoter Score, and is a commonly shared KPI (key performance indicator) for a company.
Yet market researchers like myself know that what a customer says on a survey isn’t always how they will behave. This discrepancy between what someone says and what someone does is as age-old as our parents telling us as children, “do as I say, not as I do” – and that paradigm no longer holds. Limiting oneself to an NPS score will therefore restrict your ability to truly understand your Voice of the Customer. And further, if you do not understand your customers’ actual likelihood to recommend you to others or to purchase from you again, how can you predict their lifetime value or propensity for future revenue? You can’t.
Now, I am ranting. I get it.
But I want you to understand that the social media content available across the social network spheres can fill that gap. It can help you understand how your customers truly perceive your goods or services. Trust me: customers are more likely to tweet (use Twitter) to vent, in 140 characters or less, about a negative experience than they are to take the time to fill out a survey. Likewise, they are more likely to rave about a great experience with your company.
So, why shouldn’t this social ‘chatter’ be tied back into the business intelligence platforms, and further, mined out specifically to inform customer feedback loops, voice of the customer & value stream maps, for example?
Going one step further, having a BI PlayBook focuses the attention of the metric owners on the areas that need to be addressed, while filtering out the noise that can detract from the intended purpose.
If we are going to make folks responsible for the performance of a given metric, shouldn’t we also help them understand what is expected of them up front, as opposed to when something goes terribly wrong, signified by the “text message” tirade of an overworked CEO waking you out of your slumber at 3 AM?
Further, understanding how to address an issue, whom to communicate it to, and most importantly, how to resolve it and respond to affected parties are all part of a well-conceived BI playbook.
It truly takes BI to that next level. In fact, two years ago, I presented this very topic at the TDWI Executive Summit in San Diego (Tying Business Processes into Your Business Intelligence). While I got a lot of stares à la the ‘dog tilting its head to the side in a confused glare at its owner’ look, I hope people can look back on that experience with moments of ‘ah ha – that is what she meant,’ now that they have evolved (a little) in their BI maturity.
Gartner BI Magic Quadrant 2011 – Keeping with the Tradition
I have posted the Gartner Business Intelligence ‘BI’ Magic Quadrant (in addition to the ETL quadrant) for the last several years. To say that I missed the boat on this year’s quadrant is a bit extreme, folks, though I am sorry for my delay. I did not realize there were readers who counted on me to post this information each year. I am a few months behind the curve on getting this to you, dear readers. But with that said, it is better late than never, right?
Oh, and who is really ‘clocking’ me anyway, other than myself? But that is a whole other issue for another post, some other day.
As an aside, I am excited to say that my latest website, http://www.biplaybook.com, is finally published. Essentially, I decided that the next step after Measuring BI data, Making the Measurements Meaningful, and Modifying Meaningful Data into Metrics was to address the age-old question of 'So what?' or 'What do I do about it?'
BI PlayBook offers readers real-world scenarios that I have solved using BI or data visualizations, with the added bonus of showing how to tie the results back into the original business process you were reporting on or trying to help with BI, or back into the customer service/satisfaction process. This latter one is quite meaningful to me, because so often we find our voices go unheard, especially when we complain to large corporations via website feedback, surveys or (gasp) calling into their call center(s). Feedback should be tied directly back into the performance being measured, whether it is operational, tactical, managerial, marketing, financial, retail, production and so forth. So, why not tie it back into your business intelligence platforms, using feedback loops and Voice of the Customer maps / value stream maps to do so?
Going one step further, having a BI PlayBook lets the end users of your BI systems, the ones signed up for and responsible for the metrics being visualized and reported out to the company, know what they are expected to do to address a problem with a metric, who they are to communicate both the issue and the resolution to, and what success looks like.
Is it really fair of us, BI practitioners, to build and assign responsible ownership to the leaders of the world without giving them some guidance (documented, of course) on what to do about these new responsibilities? We are certainly the first to be critical when a 'red' issue shows up on one of our reports/dashboards/visualizations. How cool would it be to look at these red events, see the people responsible get alerted to said fluctuation, and, further, see said person take appropriate and reasonable steps towards resolution? Well, a playbook offers the roadmap, or guidance, around this very process.
It truly takes BI to that next level. In fact, two years ago, I presented this very topic at the TDWI Executive Summit in San Diego (Tying Business Processes into your Business Intelligence). The PlayBook is the documented ways and means to achieve this outcome in a real-world situation.
To Start Quilting, One Just Needs a Set of Patterns: Deconstructing Neural Networks (my favorite topic of the day, week, or year)
How a Neural Network Works:
A neural network (#neuralnetwork) uses rules it “learns” from patterns in data to construct a hidden layer of logic. The hidden layer then processes inputs, classifying them based on the experience of the model. In this example, the neural network has been trained to distinguish between valid and fraudulent credit card purchases.
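To make that concrete, here is a minimal sketch of the same idea using scikit-learn's MLPClassifier on invented transaction features (purchase amount and distance from home); the data is made up, and a real fraud model would obviously be far richer.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Toy transactions: [purchase amount in $, distance from home in km]
# Labels: 0 = valid, 1 = fraudulent (entirely made-up training data)
X = np.array([[25, 2], [40, 5], [15, 1], [60, 8],
              [950, 800], [1200, 650], [875, 900]], dtype=float)
y = np.array([0, 0, 0, 0, 1, 1, 1])

# Scale the inputs, then let a single hidden layer learn the decision boundary.
model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(8,), max_iter=5000, random_state=42),
)
model.fit(X, y)

# Classify two unseen transactions through the trained hidden layer.
print(model.predict([[30, 3], [1000, 700]]))   # expected: [0 1]
```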
This is not your mom’s apple pie or the good old days of case-based reasoning or fuzzy logic. (Although, the latter is still one of my favorite terms to say. Try it: fuzzzzyyyy logic. Rolls off the tongue, right?)…But I digress…
And, now, we’re back.
To give you a quick refresher:
Case-based reasoning represents knowledge as a database of past cases and their solutions. The system uses a six-step process to generate solutions to new problems encountered by the user.
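As a rough sketch of the heart of that process, the "retrieve the most similar past case" step, consider the toy example below; the support cases and similarity measure are invented for illustration, and the full multi-step cycle is collapsed into retrieval alone.

```python
# A tiny "case base" of past problems and their solutions -- illustrative only.
CASE_BASE = [
    ({"no_power", "blinking_light"}, "Replace the power supply."),
    ({"no_power", "burning_smell"}, "Unplug the unit and send it for repair."),
    ({"slow_startup", "disk_noise"}, "Back up data and replace the hard drive."),
]

def retrieve(symptoms):
    """Return the solution of the most similar past case (Jaccard similarity)."""
    def similarity(case_symptoms):
        return len(symptoms & case_symptoms) / len(symptoms | case_symptoms)
    best_case = max(CASE_BASE, key=lambda case: similarity(case[0]))
    return best_case[1]

# A new problem is matched against past cases instead of being solved from scratch.
print(retrieve({"no_power", "burning_smell"}))
```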
We're talking old school, folks. Think to yourself: frustrating FAQ pages, where you type a question into a search box, only to have follow-on questions prompt you for further clarification and, with each one, further frustration. Oh, and BTW, these are the same FAQ pages which e-commerce sites laughably call 'customer support' –
"And I wonder why your ACSI customer service scores are soo low, Mr. or Mrs. e-Retailer :)," says this blogger facetiously to her audience.
And, we’re not talking about fuzzy logic either – Simply put, fuzzy logic is fun to say, yes, and technically is:
–> Rule-based technology with exceptions
–> Represents linguistic categories (for example, “warm”, “hot”) as ranges of values
–> Describes a particular phenomenon or process and then represents it in a small number of flexible rules
–> Provides solutions to scenarios typically difficult to represent with succinct IF-THEN rules
(Graphic: Take a thermostat in your home and assign membership functions for the input called temperature. This becomes part of the logic of the thermostat to control the room temperature. Membership functions translate linguistic expressions such as “warm” or “cool” into quantifiable numbers that computer systems can then consume and manipulate.)
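Here is a minimal sketch of what those membership functions might look like in code; the temperature ranges are made up, and a real fuzzy controller would tune these curves carefully.

```python
def cool(temp_c):
    """Membership in 'cool': full below 15 C, fading to zero by 20 C."""
    if temp_c <= 15:
        return 1.0
    if temp_c >= 20:
        return 0.0
    return (20 - temp_c) / 5

def warm(temp_c):
    """Membership in 'warm': zero below 18 C, full from 23 C upward."""
    if temp_c <= 18:
        return 0.0
    if temp_c >= 23:
        return 1.0
    return (temp_c - 18) / 5

# At 19 C the room is partly 'cool' and partly 'warm' at the same time --
# exactly the kind of overlap that crisp IF-THEN rules struggle to express.
print(cool(19), warm(19))
```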
Nope, we are talking Neural Networks – the absolute bee's knees in my mind, right up there with social intelligence and my family (in no specific order :) ):
–> Find patterns and relationships in massive amounts of data that are too complicated for humans to analyze
–> "Learn" patterns by searching for relationships, building models, and correcting the model's own mistakes over and over again (see the quick sketch after this list)
–> Humans "train" the network by feeding it training data for which the inputs produce a known set of outputs or conclusions, helping the neural network learn the correct solution by example
–> Neural network applications in medicine, science, and business address problems in pattern classification, prediction, financial analysis, and control and optimization
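To illustrate that "correct your own mistakes over and over" idea, here is a quick sketch of a single artificial neuron trained with the classic perceptron rule on a toy AND problem; it is deliberately simplified, since real networks use many neurons and gradient-based learning.

```python
# Train a single neuron on the logical AND function using the perceptron rule.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
weights, bias, learning_rate = [0.0, 0.0], 0.0, 0.1

for epoch in range(20):                      # repeat until the mistakes stop
    for (x1, x2), target in data:
        output = 1 if weights[0] * x1 + weights[1] * x2 + bias > 0 else 0
        error = target - output              # the "mistake" to correct
        weights[0] += learning_rate * error * x1
        weights[1] += learning_rate * error * x2
        bias += learning_rate * error

print(weights, bias)
print([1 if weights[0] * x1 + weights[1] * x2 + bias > 0 else 0
       for (x1, x2), _ in data])             # expected: [0, 0, 0, 1]
```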
Remember folks: Knowledge is power and definitely an asset. Want to know more? I discuss this and other intangibles further in part 1 of a multi-part study I am conducting called:
Measuring Our Intangible Assets, by Laura Edell
Investigative Analysis Part 1: Quantifying the Market Value of an Organization’s Intangible Asset Known as ‘Knowledge’
OK, so I’ve decided to conduct another multi-part study similar to what I did last year.
This time, I will be analyzing and attempting to quantify an organization's intangible assets. Specifically, the following:
• knowledge, brands, reputations, and unique business processes
So, starting with knowledge: First, the chart is a little outdated, but I will source the last two years and update the graph later in the series. Regardless, it is interesting nonetheless. And since I am the Queen advocate for measuring what matters and managing what you can measure, consider the following my attempt to drink my own Kool-Aid. The chart below depicts revenue growth over a 7-year period ending in 2008. Those of you, my dear readers, who are also fellow Business Intelligence practitioners should be able to attest at first glance to this statistical representation of Content Management Systems (CMS) and portals' year-over-year revenue growth.
In fact, many of us have been asked to integrate BI dashboards and reports into existing corporate portals like Microsoft SharePoint, or into the native portals bundled with most enterprise-grade BI products like MicroStrategy or SAP/Business Objects, right? Many of us have been tasked with drafting data dictionaries, data governance documentation, and source-protected project and code repositories; i.e., knowledge-capture areas. But even with my vast knowledge (no pun intended), I was unaware that the growth spurt specific to CMSs was as dramatic as depicted below, sourced from Prentice Hall.
In fact, between 2001 and 2008, CMS revenue grew from ~$2.5B to ~$22B, with the greatest spurt beginning in 2003 and skyrocketing from there.
Conversely, the portal revenue growth was substantially less. This was a surprise. I must have heard the words 'SharePoint' and 'implementation' more than any others between 2007 and 2009, whereas the sticker shock that came with an enterprise-grade CMS sent many a C-level into the land of Nod, never to return until the proven VALUE cloud could ride them home against the nasty cop known as COST.
A-ha moment, folks: portal products were far less costly than the typical Documentum or IBM CMS.
In fact, Jupiter's recent report on CMSs stated:
“In some cases, an organization will deploy several seemingly redundant systems. In our sampling of about 800 companies that use content management packages, we discovered that almost 15 percent had implemented more than one CMS, often from competing vendors. That’s astounding, especially when you consider that an organization that deploys two content management systems can rack up more than $1 million in licensing fees and as much as $300,000 in yearly maintenance costs. Buying a second CMS should certainly raise a red flag for any CIO or CFO about to approve a purchase order.”
That's 120 companies from the Jupiter sample spending $1M apiece in licensing, or a $120M baseline. Extend that to all organizations leveraging CMS technology, and therein lies the curious case of the revenue growth spurt.
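For those who like to see the arithmetic, here is a quick back-of-the-envelope sketch using the figures quoted above (the extrapolation beyond the sample is mine and deliberately rough):

```python
sampled_companies = 800
duplicate_cms_rate = 0.15            # ~15% ran more than one CMS (Jupiter sample)
extra_licensing_per_company = 1_000_000
extra_maintenance_per_year = 300_000

duplicated = int(sampled_companies * duplicate_cms_rate)        # 120 companies
licensing_baseline = duplicated * extra_licensing_per_company   # $120M in licensing
annual_maintenance = duplicated * extra_maintenance_per_year    # $36M per year

print(duplicated, licensing_baseline, annual_maintenance)
```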
To that, I say, Kiss My Intangible Assets! Knowledge is power, except when parked in someone's head. Now, when will someone invent the physical drainage system for exactly that knowledge, with or without permission of its holder? These gatekeepers need to go; they are often the dinosaurs fearing the newbie college grads and, worst of all, CHANGE.
In part 2, we will discuss another fave of mine: Brand You!
Sideline Comparative Predictions: Gartner’s 2010 Technology Trends
As promised in my last blog post, here is the comparison list of predictions for the Top 10 Strategic Technologies for 2010. I have highlighted the second trend on Gartner's list, because this drum has been beaten by yours truly for years now, sometimes shot down and sometimes embraced, for my belief that advanced analytics are where the value within BI truly lies. Those who adopt now will beat the curve of the trend and reap the rewards long overdue to companies who have invested millions into BI programs without realizing much gain, no matter which service implementer was used. ({Shameless plug} If the reader had used Mantis Technology Group, my company, this would be moot, as you would be reveling in the realized value that we bring, since yours truly is an employee and implementer of these very BI systems.) When it comes to the broad realm of BI, or the facets within BI like social intelligence (another prediction), advanced analytics or cloud computing (yet another prediction), Mantis excels at infusing value into even the smallest-scale implementation. Having gone from being the client to now being the service provider, I have worked with the very largest firms and those that claim to be the best, down to niche providers like ourselves on a slightly bigger scale. I say with all earnestness that Mantis' offering truly stands above those in both spaces that I had previously hired, having often been left with that disappointing feeling when one realizes they did not get what they expected, and, when confronting those who delivered the end result, being led down the "let's get the SOW and look at what you asked for" route, which never ends well. Clients, such as myself in my former life, often don't know what they don't know, especially when implementing technologies in which they may not be well versed. As I have belabored before and will do again quickly now, it is up to the service provider to hang up their $$ hat and help the client understand enough to be dangerous and make educated choices, not just those that will return the greatest financial gains, but those that will truly help deliver on the value proposition that IS POSSIBLE from well-implemented BI programs.
As said before, please share your predictions, comments or anecdotes with our readership. I (we) would love to hear your opinion too!
The top 10 strategic technologies as predicted by Gartner for 2010 include:
Cloud Computing. Cloud computing is a style of computing that characterizes a model in which providers deliver a variety of IT-enabled capabilities to consumers. Cloud-based services can be exploited in a variety of ways to develop an application or a solution. Using cloud resources does not eliminate the costs of IT solutions, but does re-arrange some and reduce others. In addition, enterprises consuming cloud services will increasingly act as cloud providers and deliver application, information or business process services to customers and business partners.
Advanced Analytics. Optimization and simulation is using analytical tools and models to maximize business process and decision effectiveness by examining alternative outcomes and scenarios, before, during and after process implementation and execution. This can be viewed as a third step in supporting operational business decisions. Fixed rules and prepared policies gave way to more informed decisions powered by the right information delivered at the right time, whether through customer relationship management (CRM) or enterprise resource planning (ERP) or other applications. The new step is to provide simulation, prediction, optimization and other analytics, not simply information, to empower even more decision flexibility at the time and place of every business process action. The new step looks into the future, predicting what can or will happen.
Client Computing. Virtualization is bringing new ways of packaging client computing applications and capabilities. As a result, the choice of a particular PC hardware platform, and eventually the OS platform, becomes less critical. Enterprises should proactively build a five to eight year strategic client computing roadmap outlining an approach to device standards, ownership and support; operating system and application selection, deployment and update; and management and security plans to manage diversity.
IT for Green. IT can enable many green initiatives. The use of IT, particularly among the white collar staff, can greatly enhance an enterprise’s green credentials. Common green initiatives include the use of e-documents, reducing travel and teleworking. IT can also provide the analytic tools that others in the enterprise may use to reduce energy consumption in the transportation of goods or other carbon management activities.
Reshaping the Data Center. In the past, design principles for data centers were simple: Figure out what you have, estimate growth for 15 to 20 years, then build to suit. Newly-built data centers often opened with huge areas of white floor space, fully powered and backed by an uninterruptible power supply (UPS), water- and air-cooled and mostly empty. However, costs are actually lower if enterprises adopt a pod-based approach to data center construction and expansion. If 9,000 square feet is expected to be needed during the life of a data center, then design the site to support it, but only build what's needed for five to seven years. Cutting operating expenses, which are a nontrivial part of the overall IT spend for most clients, frees up money to apply to other projects or investments either in IT or in the business itself.
Social Computing. Workers do not want two distinct environments to support their work – one for their own work products (whether personal or group) and another for accessing “external” information. Enterprises must focus both on use of social software and social media in the enterprise and participation and integration with externally facing enterprise-sponsored and public communities. Do not ignore the role of the social profile to bring communities together.
Security – Activity Monitoring. Traditionally, security has focused on putting up a perimeter fence to keep others out, but it has evolved to monitoring activities and identifying patterns that would have been missed before. Information security professionals face the challenge of detecting malicious activity in a constant stream of discrete events that are usually associated with an authorized user and are generated from multiple network, system and application sources. At the same time, security departments are facing increasing demands for ever-greater log analysis and reporting to support audit requirements. A variety of complementary (and sometimes overlapping) monitoring and analysis tools help enterprises better detect and investigate suspicious activity – often with real-time alerting or transaction intervention. By understanding the strengths and weaknesses of these tools, enterprises can better understand how to use them to defend the enterprise and meet audit requirements.
Flash Memory. Flash memory is not new, but it is moving up to a new tier in the storage echelon. Flash memory is a semiconductor memory device, familiar from its use in USB memory sticks and digital camera cards. It is much faster than rotating disk, but considerably more expensive; however, this differential is shrinking. At the rate of price declines, the technology will enjoy more than a 100 percent compound annual growth rate during the next few years and become strategic in many IT areas including consumer devices, entertainment equipment and other embedded IT systems. In addition, it offers a new layer of the storage hierarchy in servers and client computers that has key advantages including space, heat, performance and ruggedness.
Virtualization for Availability. Virtualization has been on the list of top strategic technologies in previous years. It is on the list this year because Gartner emphasizes new elements such as live migration for availability that have longer-term implications. Live migration is the movement of a running virtual machine (VM), while its operating system and other software continue to execute as if they remained on the original physical server. This takes place by replicating the state of physical memory between the source and destination VMs, then, at some instant in time, one instruction finishes execution on the source machine and the next instruction begins on the destination machine.
However, if replication of memory continues indefinitely, while execution of instructions remains on the source VM, and the source VM then fails, the next instruction simply takes place on the destination machine. If the destination VM were to fail, just pick a new destination and start the indefinite migration again, thus making very high availability possible.
The key value proposition is to displace a variety of separate mechanisms with a single “dial” that can be set to any level of availability from baseline to fault tolerance, all using a common mechanism and permitting the settings to be changed rapidly as needed. Expensive high-reliability hardware, with fail-over cluster software and perhaps even fault-tolerant hardware could be dispensed with, but still meet availability needs. This is key to cutting costs, lowering complexity, as well as increasing agility as needs shift.
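A highly simplified sketch of that replicate-continuously-then-cut-over idea is below; it is a toy simulation of the concept Gartner describes, not how any real hypervisor is implemented.

```python
import random

class ToyVM:
    """A toy VM whose 'memory' is just a dict of page_id -> contents."""
    def __init__(self, memory=None):
        self.memory = dict(memory or {})
        self.alive = True

def replicate(source, destination):
    """Continuously copy the source VM's memory state to the standby destination."""
    destination.memory = dict(source.memory)

source = ToyVM({0: "app code", 1: "session data"})
standby = ToyVM()

# Normal operation: the source executes instructions while memory is replicated.
for step in range(10):
    source.memory[1] = f"session data v{step}"   # the running VM keeps mutating state
    replicate(source, standby)
    if random.random() < 0.2:                    # simulate a sudden host failure
        source.alive = False
        break

# Failover: execution simply resumes on the standby, whose memory mirrors the source.
active = source if source.alive else standby
print("running on:", "source" if active is source else "standby", active.memory)
```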
Mobile Applications. By year-end 2010, 1.2 billion people will carry handsets capable of rich, mobile commerce providing a rich environment for the convergence of mobility and the Web. There are already many thousands of applications for platforms such as the Apple iPhone, in spite of the limited market and need for unique coding. It may take a newer version that is designed to flexibly operate on both full PC and miniature systems, but if the operating system interface and processor architecture were identical, that enabling factor would create a huge turn upwards in mobile application availability.
“This list should be used as a starting point and companies should adjust their list based on their industry, unique business needs and technology adoption mode,” said Carl Claunch, vice president and distinguished analyst at Gartner. “When determining what may be right for each company, the decision may not have anything to do with a particular technology. In other cases, it will be to continue investing in the technology at the current rate. In still other cases, the decision may be to test/pilot or more aggressively adopt/deploy the technology.”
Article copied from: http://www.gartner.com/it/page.jsp?id=1210613
Sideline Topic: Looking for Feedback on What YOU THINK about CMS Watch's 2010 Technology Predictions
Can it be true that, finally, in 2010 the market focus within the technology sector will shift to customer-facing systems and internal applications delivering more meaningful content applicability?
Looking back at the content of my own blog, my readers and I (thank you, lovely readers!) have been feeling the need for business intelligence to step back into customer intelligence once again, a place we BI practitioners have been before. Meanwhile, we go-forward, capture-the-world Gen X, Y and Z'ers have shifted the realm of what we need in terms of content delivery. These are the generations of the "serve it up TO US in a Google-style fashion, otherwise I am too busy to look for the information on your website" crowd, where texting is the preferred vehicle for communication and anything that requires more than two hops to reach the information we need is one step too many. Sad, but true; and those who realize this fact of life now will adjust and survive when this generation, now in college, graduates and enters our realm of the workplace. And so I bring you CMS Watch's predictions, followed by the tried-and-true Gartner predictions for comparison's sake. Please let me know what you think, what your own predictions are, or any other comments you want to share! I welcome a new year – how about you? 🙂
Article copied from: http://www.information-management.com/news/ecm_serach_cloud_sharpoint_mobile_document_management-10016801-1.html?msite=cloudcomputing
The recent recessionary period in particular yielded many content technology investments focused on customer-facing systems, CMS Watch founder Tony Byrne was quoted as saying. "In 2010 we will see a renewed focus on internal applications."
- Enterprise content management and document management will go their separate ways.
- Faceted search will pervade enterprise applications.
- Digital asset management vendors will focus on SharePoint integration over geographic expansion.
- Mobile will come of age for document management and enterprise search.
- Web content management vendors will give more love to intranets.
- Enterprises will lead thick client backlash.
- Cloud alternatives will become pervasive.
- Document services will become an integrated part of enterprise content management.
- Gadgets and Widgets will sweep the portal world.
- Records managers face renewed resistance.
- Internal and external social and collaboration technologies will diverge.
- Multilingual requirements will rise to the fore.