In a rare API conference appearance, Twitter graced the stage at last week’s APIDays Paris. The social giant shared some insight into current API usage among third-party developers and gave some read-between-the-lines signs of how it intends to work with API partners in the future.
Paris may have been a strategic location for Twitter to present on its API, given its absence from other recent API-specific conferences including API Strategy and Practice, API World and even the Twilio conference, which had been able to snare some notable keynote presenters for its November API leadership event. Romain Huet, from Platform Relations at Twitter, spoke last week about how Twitter is “Connecting to the pulse of the planet” in a decidedly non-U.S.-focused presentation.
Acknowledging that the majority (77%) of Twitter users are located outside of the United States, Huet walked the audience through some of the social media giant’s recent international engagement achievements. In particular, he referenced how Twitter was one of the only on-the-ground news sources during the Turkey riots, and was a key media player helping connect relief efforts to areas of need in the recent Philippines disaster. Keeping the global theme, Huet went on to provide current examples of how European and UK businesses are making use of Twitter APIs in their business model.
Reading between the lines of the presentation, the take-home message for API developers wanting to make use of Twitter’s data “firehose” appears to be that they are best placed to create high value-add products for specific target audiences. The move away from encouraging developers to create Software-as-a-Service or other truly scalable solutions based primarily on the Twitter API is now complete. Developers, it seems, are encouraged instead to identify opportunities to access Twitter data streams in real time and provide valuable insights for specific industry verticals.
While some examples showcased by Huet did have a scalable component to them, for the most part the Twitter rep seems to be saying that using the Twitter APIs to create an automated service is not what the company is looking for in new API partnerships. Following the model used by Pinterest’s recently released APIs, Twitter’s newest API product, custom timelines, is not being provided as an open API that encourages a slew of new third-party apps. It is instead available only by submitting an application detailing how prospective partners would like to use the API to add value to their existing industry relationships.
Huet pointed to several examples of how the Twitter API is being used by global startups.
Vigiglobe: Vigiglobe uses social media analysis, linguistics algorithms and data mining to create real-time dashboards and analytics products for brands, sports, political observers, TV broadcasters and other specific verticals. The business model seems to be based on a consultancy fee structure.
Electionista: Electionista aims to help users understand political trends by analyzing political tweets in real time against other data mining to provide context. While a scalable, automated service is available via its website (with a pro version also creating a business revenue stream), the bulk of Electionista’s business model seems to be based on providing project-based data-driven products, consultancy and training services for specific customers, including party election teams and media.
Comenta: Comenta targets TV and live event broadcasters. It offers a paid service to customers that allows them to enhance engagement during live events, sports, performances and broadcasts of specific TV programs. Like Vigiglobe and Electionista, the service uses the Twitter Streaming API to provide higher-level, sophisticated analysis of social media conversations alongside other data mining to create analytics tools and visualizations for clients.
None of these services has a business model that primarily focuses on scaling an automated service from the Twitter API (although Electionista’s Pro service has that option in part, it seems to be just as much about creating a gateway service to build relationships with potential consultancy clients). Instead, each is focused on targeting a particular market segment for detailed analytics, using the Twitter Streaming API to provide real-time analysis, often combined with other data sets that provide greater context to what is happening.
Continuing the global case study theme, Huet also showed how UK startup Style on Screen is using the Twitter API. Here, monetization appears to come from affiliate sales links for product details shared via Twitter (we have requested confirmation of this business model from Style on Screen, but at time of publication hadn’t heard back from the startup). When a TV viewer wants more information about the fashions worn on their favorite TV program, they can tweet Style on Screen for more product details and will be sent a reply link via Twitter indicating where the item can be purchased.
In a life-imitating-art moment, Huet gave an example of how an outfit worn by a character on The Mindy Project drove new retail clothing sales based on the Twitter API-enabled Style on Screen app. Some of the writers of The Mindy Project previously worked on 30 Rock, where the lead character once quipped that she wanted an app to buy things she saw on TV: “Like if you’re watching Sex and The City and you just have to have Mr. Big’s spaghetti.”
Huet describes the Twitter APIs as having two distinct purposes: “Streaming APIs help you ingest what is happening right now while our REST APIs allow you to perform actions and review what has happened previously on the platform,” Huet said.
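That split can be illustrated with the two base URLs the platform used at the time (v1.1-era endpoint layout). The sketch below builds the request URLs only; no network calls are made, and a real request would also require OAuth signing, which is omitted here:

```python
# Sketch only: contrasts Twitter's REST and Streaming API endpoints
# (v1.1-era URLs). Building a real request would also need OAuth 1.0a.
from urllib.parse import urlencode

REST_BASE = "https://api.twitter.com/1.1"       # request/response: query past tweets
STREAM_BASE = "https://stream.twitter.com/1.1"  # long-lived connection: live tweets

def rest_search_url(query):
    """REST API: review what has already happened on the platform."""
    return f"{REST_BASE}/search/tweets.json?{urlencode({'q': query})}"

def streaming_filter_url():
    """Streaming API: ingest what is happening right now."""
    return f"{STREAM_BASE}/statuses/filter.json"

print(rest_search_url("#apidays"))
print(streaming_filter_url())
```

The practical difference is in the connection model: the REST endpoint returns a finite response per request, while the streaming endpoint keeps the HTTP connection open and pushes tweets as they arrive.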
Huet points to specific parameters that developers can use with the Streaming API. For example, the follow parameter lets developers filter the stream to tweets from specific accounts by Twitter user ID, while the track parameter filters by keyword or hashtag. Developers can also use the locations parameter to only return tweets sent within a defined set of bounding boxes. The Twitter Streaming API also includes entities metadata, which returns data on mentions, retweets and un-shortened URL links (for more details on how to use Twitter APIs, check out our recent Twitter API tutorials).
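As a minimal sketch of how those filter parameters are assembled before being POSTed to the statuses/filter endpoint (the helper function is hypothetical; the parameter names and comma-separated value formats follow the Streaming API documentation of the time):

```python
def build_filter_params(follow=None, track=None, locations=None):
    """Build the POST body for Twitter's statuses/filter endpoint.

    follow:    list of numeric Twitter user IDs
    track:     list of keywords or hashtags
    locations: list of (sw_lon, sw_lat, ne_lon, ne_lat) bounding boxes
    All values are sent as comma-separated strings.
    """
    params = {}
    if follow:
        params["follow"] = ",".join(str(uid) for uid in follow)
    if track:
        params["track"] = ",".join(track)
    if locations:
        params["locations"] = ",".join(
            str(coord) for box in locations for coord in box
        )
    return params

# Track a hashtag and restrict to a rough London bounding box:
params = build_filter_params(track=["#apidays"],
                             locations=[(-0.5, 51.3, 0.3, 51.7)])
print(params)
```

Note that follow, track and locations are OR-ed together by the endpoint, so combining them widens rather than narrows the stream.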
For developers who want to analyze all Twitter conversations instead of starting with a set of parameter filters, it is possible to conduct your own big data analysis by streaming what Huet calls “a sample of the firehose” (internally, the Twitter API team has been calling this the ‘garden hose’). This is a “digestible amount of big data” (1% of all tweets per day are provided via this API feature). Finally, Huet did indicate that some partners – like Topsy Analytics and DataSift – are able to access “the full firehose” of all tweets, but these are specifically arranged business deals unlikely to be opened up to anyone who wants to have a play with big data and the social media platform.
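Whichever stream a developer connects to, the payload arrives the same way: as newline-delimited JSON, with blank lines sent periodically as keep-alives. A minimal parsing sketch (the sample payload below is invented for illustration):

```python
import json

def parse_stream(lines):
    """Parse newline-delimited JSON as delivered by Twitter's streaming
    endpoints. Blank lines are keep-alive signals and are skipped."""
    for line in lines:
        line = line.strip()
        if not line:
            continue  # keep-alive newline, not a tweet
        yield json.loads(line)

# Simulated stream chunks (hypothetical tweets, not real data):
raw = [
    '{"id": 1, "text": "Bonjour #apidays", "user": {"screen_name": "dev1"}}',
    '',
    '{"id": 2, "text": "Streaming demo", "user": {"screen_name": "dev2"}}',
]
tweets = list(parse_stream(raw))
print([t["text"] for t in tweets])  # → ['Bonjour #apidays', 'Streaming demo']
```

In a real client the lines would come from a long-lived HTTP connection rather than a list, and the consumer would also need to handle disconnects and backoff.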
Huet recommended API developers familiarize themselves with github.com/twitter for open source examples of how people are streaming Twitter data.
Huet also pointed to the new custom timelines API product now in private beta release. Huet encouraged developers attending APIDays who had a clear use case in mind to submit an application for access to this new data service. Similar to how Pinterest is opening its API to select partners, the approach seems to be that applicants need to first demonstrate where they are positioned in their specific industry vertical and to explain how they plan to make use of the API service.
Entrepreneurs will need to assess how to share this information while possibly still maintaining first-mover advantage, although the focus seems to be on working with established brands that are looking for additional engagement strategies. Given the theme of Huet’s talk, being able to demonstrate global reach, or accenting use of the custom timelines feature in markets outside the United States, may be a useful angle to emphasize when making an application. Again, developers applying for access may want to focus on how they would use the functionality to provide greater value to their existing customer base, or to deepen analytics insights in specific verticals, rather than creating a Storify-type automated service that can scale.
The session ended with what we expect will become a 2014 conference trend: finishing your presentation with a crowd-pleasing demo of how to use an API to connect your data to a drone.
With other discussions at APIDays exploring what exactly the true nature of open APIs is, the partnership model being proposed by Twitter (and Pinterest) may be the way some bigger API providers move to provide “open” access to their data streams in the future. In such cases, understanding how the API provider is framing the supply of data via API will be critical in submitting an application for access that can demonstrate alignment with the API provider’s future business goals.