Mindshare Technologies is attending the Sentiment Analysis Symposium in New York City and will be updating this post live throughout the day. Please join Kurt Williams (CTO) and Jon Sanderson (VP of Marketing) as we engage with some of the industry's brightest minds discussing the application of text analytics and sentiment analysis.

-5:10 PM- Kurt Williams: Paolo Giacomazzi, Politecnico di Milano, speaking on Milan Goes Social: Customer Experience, Reputation, and Semantics for Tourism. He related an interesting story: the Milan tourism group ran a survey asking, "Did you come to Milan for arts and culture?" Respondents, almost insulted, always selected "Yes!" On analyzing Twitter data, however, quite the opposite was revealed. In other words, how you capture the data can influence the results; because Twitter is voluntary, there is no question-phrasing bias. During the process, Milan found that individual comments must be analyzed semantically. Milan identified the "drivers" of both volume and sentiment and used them to guide the overall analysis. Identifying drivers of sentiment is worthwhile.

-4:39 PM- Kurt Williams: Josh Merchant, CTO & Co-founder, Lymbix, Inc. The Right Formula for Sentiment Success: Emotion, Clarity, and Intensity. Learning + (Asking + Learning) = Sentiment x Subjectivity. There is a perception that language analysis such as sentiment is impossible, but in fact there is not a limitless number of combinations of words and meanings; it is a solvable problem. After that, ask the right questions. Avoid binary questions: yes/no doesn't give enough data. How much yes? How much no? Ask better questions. Embrace being wrong, as there is incredible power in knowing for certain that you were wrong. On subjectivity: subjectivity and accuracy go hand in hand, because determining whether text analytics is accurate is itself subjective.

-2:56 PM- Kurt Williams: K.D.
Paine, CEO, KDPaine & Partners: Beyond Sentiment: How to Really Get Value out of an Automated System. Thanks first to the Web and later (and in greater magnitude) to social media, the volume of noise is increasing, which is why people are falling over themselves to automate everything. Sentiment analysis technology is improving but far from perfect, and some use cases are better than others. The social media dialog is uncontrollable and cuts across every department, so it can't be a simple initiative within just PR or just marketing. One area where automation shines is in categorizing content and routing it to the right department. A big question should be "so what?" Integrated BI is the future, but most companies are currently stuck at the "pretty charts" phase and haven't figured out what to do with all the data.

Examining the limitations of sentiment analysis: sentiment analysis assumes sentiment exists, yet most businesses do not actually evoke strong enough sentiment to measure. (Fortunately, customer-employee interactions DO generate lots and lots of strong sentiment data.) Using humans to manually code can actually be much more accurate at a lower price point for certain problems and data volumes; don't rule out human coding just because it isn't techy and new. "Let's move beyond positive and negative (sentiment) to a study of relationships." In the end, "it doesn't matter if it doesn't produce results." Examine why you are interested in positive/negative sentiment in the first place. Do more than just watch the little red lines go down and the little green lines go up. Also, predictive text analytics won't work for everyone: "This will never work for Raytheon. No amount of Twitter analysis will ever lead to the sale of a missile system." "I do think social media can have a big impact on business from a process improvement perspective." "The best way to measure trust (and many other types of sentiment) is to call people up and use a survey to ask them how they feel."
-2:18 PM- Kurt Williams: Armando Gonzalez, CEO, RavenPack, on Computers that Trade off the News. First, we all take risks; risk can be mitigated somewhat by using computers to sift information. Big problem: computers can't read text, they read numbers. People might be better in some ways at classifying text, but computers are faster, more consistent, more immune to noise, and can detect trends and patterns. People get tired; computers do not. Computers can systematically detect large-scale significant events in the news in milliseconds. Use a combination of technologies and approaches, such as sentiment analysis, categorization (into predetermined buckets), and sentiment momentum and reversal analysis. It turns out that negative news is a stronger leading indicator of stock performance; negative news has more impact (is it causal or just predictive?). Consider using a "Sentiment Index" to gauge average sentiment for an entire category of content. This allows future measurements to be compared to the index: for example, a given comment can be flagged when its sentiment score is significantly above or below the index. Stock traders can also use text analysis to perform surveillance for insider trading, for example when traders are shorting a stock hours before significant negative news breaks.

-1:33 PM- Kurt Williams: Rich Brown, Global Business Manager, Machine Readable News, Thomson Reuters: News flow can predict volume and volatility. Highly positive and highly negative news matters more. Use analytics to sift content. Analyzing Twitter alone is not enough to "generate alpha" (beat the market); there isn't enough breadth, and there is too much noise, in Twitter. Consider using all content sources available. "Big volume" does not equate to "big picture."

-12:19 PM- Kurt Williams: Lipika Dey, Senior Consultant, Tata Consultancy Services, discussed the TCS Listening platform, which uses goal-driven sentiment analysis to understand the business problem.
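[KGW: An aside on Gonzalez's "Sentiment Index" above. A minimal sketch of the idea is to average sentiment scores for a whole category of content, then flag any single item whose score sits far from that index. The scores and the two-standard-deviation threshold below are hypothetical illustrations, not RavenPack's actual method.]

```python
from statistics import mean, stdev

def sentiment_index(scores):
    """Average sentiment score across a whole category of content."""
    return mean(scores)

def deviation_from_index(score, scores, threshold=2.0):
    """Compare one item's sentiment to the category index: flag it
    when it sits more than `threshold` standard deviations away."""
    z = (score - sentiment_index(scores)) / stdev(scores)
    if z > threshold:
        return "significantly above index"
    if z < -threshold:
        return "significantly below index"
    return "near index"

# Hypothetical sentiment scores in [-1, 1] for one category of news.
category = [0.1, 0.2, -0.1, 0.0, 0.15, 0.05, -0.05, 0.1]
print(deviation_from_index(0.9, category))   # significantly above index
print(deviation_from_index(0.05, category))  # near index
```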
Good advice there: begin with the end in mind. Don't just do sentiment analysis; have a plan for where you are going with it. Shape it using the business processes and actions you plan to take, not just the entities being talked about in the text. This seems to be a secondary theme at the conference.

-12:13 PM- Kurt Williams: Lawrence Rafsky, Ph.D., CEO, Acquire Media, speaking on the effects of text on stock prices. What do human readers do with sentiment analysis, versus machine-based analysis? For example, in many cases the computer will misidentify the sentiment of company briefings and PR announcements and buy/sell stock inappropriately. Once the humans take back over, the problem can correct itself, but how do you fix this? Semantics can be a problem: for example, an electronics magazine talking about "router tables" versus a woodworking magazine talking about "router tables." Context matters! Another example: jets, firemen, and boxers are all "fighters." In the future, companies will have a legal and ethical responsibility to pre-analyze their releases for sentiment in order to prevent such problems.

-12:07 PM- Kurt Williams: Marguerite Leenhardt, Paris 3 University: Using textometry to leverage opinion analysis. It can be used to cluster together authors who share similar opinions. One approach for improving opinion mining: rather than starting at the level of individual phrases, start with the context of the conversation first. In other words, many approaches skip the step of analyzing the context of the text.

-12:02 PM- Kurt Williams: Seth Earley, CEO, Earley & Associates: For social media monitoring and sentiment analysis, focus on Monitor, Measure, and Engage. Use taxonomy to map terminology relationships. Sentiment analysis likes to "lump" things together into large categories; marketing analysis likes to "split" things into smaller categories. In the end, "are marketing efforts and sentiment analysis actually making people act?" Focus on the business effect.
Make it operational.

-11:56 AM- Kurt Williams: Karo Moilanen, Oxford University: "Current sentiment analysis technologies are far from perfect because they are shallow and partial in their analysis." The goal is "Deep Sentiment Analysis," which leverages deep NLP and grammatical analysis to produce human-like reasoning. It is important to have "fully traceable output" in order to understand "why" the machine chose a particular result. Results can be improved by applying "relational sentiment analysis," where the relationship between entities is analyzed for sentiment: for example, a particular brand of car discussed by drivers with opinions about their driving experience. Even though current tools are lacking, deeper analysis is already possible, and greater coverage is available. Check out the "TheySay" demo.

-11:51 AM- Kurt Williams: Steve Pettigrew, Technical Product Manager, OpenText: Semantic navigation can be used to mine a web site. For example, it can be used to create a content mashup, combining similar content together on one web page.

-11:45 AM- Kurt Williams: Fiona McNeill, Global Product Marketing, SAS: What tools and people do you already have available to work with text and sentiment analysis? Don't use sentiment analysis to do what you already do well; find ways to integrate sentiment into what you already do well. Find areas that can benefit from polarity (positive/neutral/negative; remember, people rarely mention just one thing [KGW: we've seen that at Mindshare]), trend impacts and the effects of how things change over time, the "echo effect" of sentiment, and lag analysis (how long before the actions of your company affect sentiment).

-11:40 AM- Kurt Williams: Bryan Bell, VP Enterprise Solutions, Expert System: Semantic intelligence builds on sentiment analysis to create a "semantic network" that uses the relationships between terms to establish meaning and significance.
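[KGW: The lag analysis McNeill mentions (how long before a company action shows up in sentiment) can be illustrated with a toy sketch: score each candidate lag by how well the action series lines up with the sentiment series shifted by that lag, and pick the best. The series and the `best_lag` helper are hypothetical, not SAS functionality.]

```python
def best_lag(actions, sentiment, max_lag=5):
    """Find the lag (in periods) at which actions line up best with
    later sentiment, scoring each candidate lag with a simple dot
    product between the actions and the lag-shifted sentiment."""
    def score(lag):
        return sum(a * s for a, s in zip(actions, sentiment[lag:]))
    return max(range(max_lag + 1), key=score)

# Hypothetical daily series: an action on day 2 is followed by a
# sentiment bump two days later (day 4).
actions   = [0, 0, 1, 0, 0, 0, 0, 0]
sentiment = [0.0, 0.0, 0.0, 0.1, 0.9, 0.2, 0.0, 0.0]
print(best_lag(actions, sentiment))  # 2
```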
-11:40 AM- Kurt Williams: The next few posts will cover Lightning Talks, which are fast five-minute presentations.

-11:10 AM- Kurt Williams: Two years ago there was little interest in sentiment analysis; this is changing. Opinion identification and retrieval is different from simple classification. Not everything has to be done using the most advanced technology out there; sometimes simple approaches work fine. For example, eBay was able to use simple keyword counting on Twitter to detect when eBay was down. You don't always need super-sophisticated NLP. Do what works.

-11:00 AM- From Twitter:
- Kurt Williams: "eBay creatively leveraged sentiment mining on Tweets to detect eBay web site outages. Clever! #SAS11"
- Neil Glassman: "What do you mean I can't apply the same sentiment analysis vocabulary for camcorder & makeup discussions? #SAS11"
- Kurt Williams: "eBay was able to create a product review classifier that achieved 90% accuracy, even with a high degree of noise. #SAS11"
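[KGW: The simple keyword-counting approach mentioned above (detecting an eBay outage from Twitter chatter) could be sketched in a few lines: count tweets in a time window that mention an outage keyword, and alert when the count exceeds a baseline. The keyword list, threshold, and sample tweets below are hypothetical illustrations, not eBay's actual system.]

```python
OUTAGE_KEYWORDS = {"down", "outage", "error", "offline"}

def outage_signal(tweets, threshold=3):
    """Count tweets in this time window that mention an outage
    keyword; signal a possible outage when the count exceeds the
    threshold."""
    hits = sum(
        1 for tweet in tweets
        if OUTAGE_KEYWORDS & {w.strip(".,?!") for w in tweet.lower().split()}
    )
    return hits > threshold

window = [
    "is ebay down for anyone else?",
    "ebay checkout error again",
    "can't reach ebay, site must be offline",
    "ebay outage?? right when my auction ends",
    "just sold my old bike on ebay",
]
print(outage_signal(window))  # True: 4 keyword hits exceed the threshold of 3
```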