The app is called Him, a play on the recent movie Her. We were inspired to build this app the day before HackTech, after we overheard some girls sitting in our lounge complaining that boys never respond to texts when they need them to. The app is not gender biased and can be used by anyone. By switching on the app's service, Him monitors your texts and auto-replies, keeping the person on the other end entertained in a simple conversation. This way, no one gets upset by non-responsiveness. The user can also see what texts Him sends, in case they want to cut Him off and respond personally.
We used AlchemyAPI's keyword extraction and entity extraction APIs. Him is programmed in Java (it's a native Android app), and we downloaded the Android SDK library provided by AlchemyAPI. The keyword extraction picks up significant words and the entity extraction picks up names, so we could base our auto responses on some kind of meaningful context.
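To make the idea concrete, here is a minimal sketch (in Python, even though Him itself is a Java Android app) of how extracted keywords and entities might drive an auto-reply. The extraction results are hard-coded stand-ins shaped like AlchemyAPI's JSON output, and the template logic is our own illustration, not the app's actual code.

```python
def auto_reply(keywords, entities):
    """Build a simple reply around the most meaningful extracted context."""
    # Prefer a person's name if entity extraction found one.
    people = [e["text"] for e in entities if e["type"] == "Person"]
    if people:
        return f"Oh nice, how is {people[0]} doing?"
    # Otherwise fall back to the top-ranked keyword.
    if keywords:
        return f"Tell me more about {keywords[0]['text']}!"
    # Nothing meaningful extracted: keep the conversation alive anyway.
    return "Haha, for sure."

# Example extraction results, shaped like AlchemyAPI's JSON responses.
keywords = [{"text": "basketball game", "relevance": "0.91"}]
entities = [{"text": "Sarah", "type": "Person", "relevance": "0.85"}]

print(auto_reply(keywords, entities))  # mentions Sarah
print(auto_reply(keywords, []))        # falls back to the keyword
```

The real app would, of course, vary templates so repeated replies don't look canned.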
Currently, the app holds a simple conversation for you. We can't guarantee the correctness of its replies (it could accidentally set up a date for you at 5pm, or tell your friend to work out at the break of dawn). We'll either put some finishing touches on it for release, or dig deeper to see if we can sync it with services like Evernote and Google Calendar for a full artificial intelligence system. This hack placed in the top 10 at HackTech, so clearly more people than just the girls in our lounge want this technology.
Joraaver Chahal: I'm currently a sophomore at UCLA studying CS&E (computer science and engineering). You'll rarely see me not coding, reading about code, writing about code, or working out in my free time. I'm a somewhat seasoned Ruby on Rails developer (with knowledge of HTML5, CSS3, and JS), an Android app developer, and an aspiring indie game developer (I released my first game a month ago)! Currently, I'm reading Hacking: The Art of Exploitation, 2nd Edition by Jon Erickson, I've begun dabbling in the realm of data science with R, and I'm working with the Robotics Club at UCLA for a competition this spring.
Arjun Lakshmipathy: I'm a second-year Computer Science and Engineering and Mathematics/Economics double major at UCLA. I must say that I gained my love for hacking a bit later than my colleague here, but man is it fun! I've built an iOS app, experimented with Android development, done back-end and front-end development, and plan to try my hand at game development this summer. I want to start my own company some day, and I'm also a private pilot in training.
My twin, Suchaaver Chahal, a graphics artist (as well as a hacker) in training, deserves a shout-out. Studying EECS at UC Berkeley, he provided us with our super cool logo for Him! Also, a shout-out to Shubhankar Jain if you are reading this. Thanks for suggesting this awesome API!
Thanks for this easy-to-use NLP API! We had no experience with NLP coming into this hackathon, but your API helped set us up for the more important tasks during the hackathon.
We've recently released a case study that shows how Triberr, a leading blog content aggregator, uses AlchemyAPI to classify high volumes of blog content and rank their relevance to readers' needs.
According to Triberr founder Dan Cristo:
“One of the features that we really like is the keyword relevance ranking, because whether people on our site are searching for bloggers or content, being able to rank it in order of highest relevance is of course very important.”
Learn more about how AlchemyAPI helps power Triberr's core search functionality by downloading the full Triberr case study.
A team of students from Northwestern's Collaborative Innovation in Journalism and Technology class (Kelly Gonsalves, Salil Gupta, and Alex Sanz) recently presented their media prototype, OpShop. Some of these promising prototypes will be further developed by Knight Lab, "a team of technologists, journalists, designers and educators working to advance news media innovation through exploration and experimentation."
OpShop's goal is to be a congressional voice box and to reduce "political polarization by providing information on multiple viewpoints of an issue." To accomplish this task, the team used AlchemyAPI and Bing Search API. The result is an app that can take the URL of any news article, and return quotes on that topic from four members of Congress who each hold varying political positions (extremely liberal, moderately liberal, moderately conservative, and extremely conservative) along with the headlines and links to the articles they were pulled from.
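The core of that output is a grouping step: given quotes tagged with a political position, surface one per position so every viewpoint is represented. The sketch below illustrates that step in Python; the data shapes and field names are our assumptions for illustration, not OpShop's actual code (which pulls quotes via AlchemyAPI and the Bing Search API).

```python
# The four viewpoints OpShop presents, in display order.
POSITIONS = [
    "extremely liberal",
    "moderately liberal",
    "moderately conservative",
    "extremely conservative",
]

def one_per_position(quotes):
    """Return the first quote found for each position, in POSITIONS order."""
    by_position = {}
    for q in quotes:
        by_position.setdefault(q["position"], q)  # keep first quote per position
    return [by_position[p] for p in POSITIONS if p in by_position]

quotes = [
    {"member": "Rep. A", "position": "moderately conservative", "quote": "..."},
    {"member": "Sen. B", "position": "extremely liberal", "quote": "..."},
    {"member": "Rep. C", "position": "extremely liberal", "quote": "..."},
]

selected = one_per_position(quotes)
print([q["member"] for q in selected])  # ['Sen. B', 'Rep. A']
```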
The team will continue to work on OpShop, with these key next steps: "A more targeted selection of public figures (the members of Congress are currently randomly selected), and an analysis of dimensions beyond liberal vs. conservative."
Watch the archived live stream here.
A member of AlchemyAPI's developer community, Matthew Conlen, created an app to answer the following question:
"Given a near real-time stream of breaking news headlines, can we algorithmically determine the physical location of the events as they happen, along with some key terms or players associated with the stories? If so, how can that information be used?"
To answer this question, he uses tweets from the BreakingNews Twitter account, processes those tweets with AlchemyAPI's entity extraction API, and then turns any location entity names into latitude-longitude bounding boxes using the Google Geocoding API.
Conlen uses the results of the Geocoding API to get real-time updates on the story taking place at the location of the event. Finally, he determines whether the source is reputable using several criteria (a list of news org Twitter accounts, bio keywords, follower count, etc.) that he admits are not scientific, "but do seem to improve results."
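A heuristic like the one Conlen describes might look something like the sketch below. The weights, thresholds, and word lists are invented for illustration; only the three signals (known news orgs, bio keywords, follower count) come from his description.

```python
# Illustrative data: a few known news-org handles and journalism bio words.
NEWS_ORGS = {"BBCBreaking", "AP", "Reuters"}
BIO_KEYWORDS = {"journalist", "reporter", "correspondent", "editor"}

def reputability_score(handle, bio, followers):
    """Score a Twitter source on the three signals; weights are made up."""
    score = 0
    if handle in NEWS_ORGS:
        score += 3  # known news organization
    if any(word in bio.lower() for word in BIO_KEYWORDS):
        score += 2  # journalism-related bio
    if followers > 10_000:
        score += 1  # large audience
    return score

def is_reputable(handle, bio, followers, threshold=2):
    return reputability_score(handle, bio, followers) >= threshold

print(is_reputable("Reuters", "Top news from around the world", 20_000_000))  # True
print(is_reputable("random123", "I like cats", 12))                           # False
```

As Conlen notes, this kind of scoring isn't scientific, but it filters out enough noise to be useful.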
According to Conlen, the result is Onsite, "a working website that can with no oversight on my part keep up to date on important news happening around the world, and also provide real-time context and updates from people tweeting at the site of the event."
Text analysis can be applied to a wide range of applications and industries. Two of the most common and powerful are social media monitoring and social media analytics. These applications are about gathering information generated on social media outlets, then analyzing the data for insights and trends.
Since many of our customers have created applications to monitor social media and create analytics, we’re fortunate to have learned a lot about this space. To share some of that knowledge, we released a social media monitoring solution page in our resources section, and the first tutorial in our developers section.
Text Analysis for Social Media Monitoring and Social Media Analytics provides a high-level overview of how social media monitoring works, and addresses two challenges of social media monitoring: finding all the data and figuring out what it means.
We’re excited to unveil our first developer tutorial, which walks through all the steps you’ll need to create an application that analyzes Twitter sentiment, including the source code for an example Python application on GitHub. While the solution page is a high-level discussion, the Analyzing Twitter Sentiment tutorial shows exactly what steps a developer needs to take to create the basic scaffolding for a social media analytics application.
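The heart of that scaffolding is simple: run each tweet through a sentiment call, then tally the results. Here is a minimal sketch; `get_sentiment` is a toy stand-in for AlchemyAPI's sentiment endpoint (which labels text "positive", "negative", or "neutral"), not the tutorial's actual code.

```python
from collections import Counter

def get_sentiment(text):
    """Stand-in for the sentiment API call: a trivial word-list classifier."""
    positive, negative = {"love", "great", "awesome"}, {"hate", "awful", "bad"}
    words = set(text.lower().split())
    if words & positive:
        return "positive"
    if words & negative:
        return "negative"
    return "neutral"

def summarize(tweets):
    """Tally sentiment labels across a batch of tweets."""
    return Counter(get_sentiment(t) for t in tweets)

tweets = ["I love this phone", "awful battery life", "just landed in Denver"]
print(summarize(tweets))
```

In a real application you would swap `get_sentiment` for the actual API call and feed in tweets from the Twitter search API, as the tutorial shows.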
In our ongoing effort to make our services easier for developers to use, we’ve created four new Getting Started with AlchemyAPI guides. These guides are intended for developers and walk through the steps required to get up and running with AlchemyAPI. Areas covered include getting a key, downloading an SDK, configuring the SDK, and running example code.
Since all of our text analysis functions are accessed via a REST web API, we’re pretty agnostic about your choice of programming language. That’s why we’ve created several guides, each focused on getting started with a particular language: the four guides cover Python, PHP, Ruby, and Node.js. Look for more guides in the future as we create them for the other languages we support with an SDK.
You can access the new guides here:
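Because the services are plain REST calls, constructing a request looks much the same in any language. The sketch below builds (but does not send) a keyword-extraction request using only Python's standard library; the endpoint path and parameter names follow AlchemyAPI's REST conventions as documented at the time, and `YOUR_API_KEY` is a placeholder.

```python
from urllib.parse import urlencode

# Keyword-extraction endpoint for raw text input.
BASE = "http://access.alchemyapi.com/calls/text/TextGetRankedKeywords"

def build_request(api_key, text):
    """Assemble the request URL for a keyword-extraction call."""
    params = urlencode({
        "apikey": api_key,
        "text": text,
        "outputMode": "json",  # request JSON instead of the default XML
    })
    return f"{BASE}?{params}"

url = build_request("YOUR_API_KEY", "AlchemyAPI offers text analysis as a service.")
print(url.split("?")[0])
```

Sending it with any HTTP client and parsing the JSON response is all an SDK really does for you; the guides show the idiomatic way in each language.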
The International Biosecurity Intelligence System, or IBIS, is a project that aims to detect and monitor emerging biosecurity issues. The results are primarily used by life scientists studying biosecurity for governments, but can also be used by farmers and other industry professionals to keep updated on biosecurity developments.
The first phase of the system, which focuses on aquatic animal health, is already live, and you can check it out at http://aquatic.animalhealth.org/. Since news about threats to aquatic animal health can come from any source, anywhere in the world, the system relies heavily on automation. Every day it scours the internet for information about emerging aquatic animal diseases, using a combination of RSS feeds, search engines, industry journals, and Twitter. Then, using AlchemyAPI, it extracts the title, text, author, and locations from the content, which are fed into its decision model to determine whether the content is relevant to aquatic animal health. The end result is a list of articles pertaining to aquatic animal biosecurity issues, as shown in the screenshot below.
Emerging Aquatic Animal Health Threats via http://aquatic.animalhealth.org/
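To give a feel for the relevance decision that pipeline makes, here is a toy version in Python: after the title and text have been extracted, decide whether the content is about aquatic animal health. The keyword lists and the rule itself are invented for illustration; the real IBIS system uses a proper decision model.

```python
# Invented term lists for illustration only.
AQUATIC_TERMS = {"fish", "shrimp", "oyster", "salmon", "aquaculture", "abalone"}
DISEASE_TERMS = {"virus", "disease", "outbreak", "mortality", "pathogen"}

def is_relevant(title, text):
    """Toy rule: relevant if the content mentions both an aquatic species
    and a disease-related term."""
    content = f"{title} {text}".lower()
    has_aquatic = any(term in content for term in AQUATIC_TERMS)
    has_disease = any(term in content for term in DISEASE_TERMS)
    return has_aquatic and has_disease

print(is_relevant("Salmon virus outbreak reported", "..."))      # True
print(is_relevant("New tractor safety rules announced", "..."))  # False
```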
This is a great example of how the complex task of analyzing an incredible amount of data can be automated with the help of AlchemyAPI. The project is managed by the University of Melbourne, and like all resource-constrained institutions, it could never review all of that data manually. So all of us here at AlchemyAPI are proud that our services can help such an important project get off the ground.
GigaOM recently published The Gigaom guide to deep learning: Who’s doing it, and why it matters. It’s a great article that gives an overview of what deep learning is all about, some common applications, the startups and industry giants that are using it, and the possibilities for future developments.
In deep learning, computers learn by creating hierarchical representations using stacked neural network layers. Each new layer creates an abstraction of the layers below to understand more complex ideas. For example, in networks designed to perform facial image recognition, the first layers may detect lines or edges, while later layers are responsible for more abstract reconstructions of shapes, such as eyes or noses. Finally, the last layers put these abstract shapes together to form the face.
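The layered composition can be illustrated with a toy forward pass: each "layer" transforms the output of the one below it, so later layers operate on increasingly abstract features. In a real network these transformations are learned from data; here each layer is a fixed function purely to show the stacking.

```python
import math

def layer1(pixels):
    """'Edge detector': differences between neighboring values."""
    return [b - a for a, b in zip(pixels, pixels[1:])]

def layer2(edges):
    """'Shape detector': magnitude of local edge activity."""
    return [abs(e) for e in edges]

def layer3(shapes):
    """'Object detector': squash total activity into a score in (0, 1)."""
    return 1 / (1 + math.exp(-sum(shapes)))

def forward(pixels):
    # Each layer consumes the representation built by the layer below it.
    return layer3(layer2(layer1(pixels)))

print(round(forward([0.0, 0.9, 0.1, 0.8]), 3))
```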
At AlchemyAPI, we use these deep learning techniques to power our text analysis and image recognition capabilities. By using massive amounts of data, GPU hardware, and deep neural networks, we are able to increase both the number of things we understand and the accuracy of our systems. As the GigaOM article points out, we are in good company: Facebook, Google, IBM, Microsoft, Yahoo, and a growing number of startups, including AlchemyAPI, are spending considerable R&D resources on developing deep learning technology.
Back in June of this year, we wrote a blog post about Article Optimizer, a tool created by Zack Proser. Article Optimizer analyzes your content for keyword density and suggests trending keywords and royalty-free images to help you produce more engaging content and get more traffic. A lot goes into the process of writing good content, and this tool aims to help make it easier.
You can check out Article Optimizer for yourself at: http://www.article-optimize.com/
We recently sat down with Zack (virtually, of course) and asked him a few questions about Article Optimizer. Zack has been creating content for the web for several years now, and was initially overwhelmed by the number of “checkboxes” you had to hit to create successful content: things like search demand, keywords, links, additional semantic info, and which kinds of images and video to include. Over time, however, he came to the realization that the most successful content is informative, useful, and fun to read.
So Article Optimizer was created as a tool to let the author focus on writing good content, and it will make sure you check all those boxes. In the background, Article Optimizer handles things like keyword density, figuring out the trending keywords, showing you competitive content, etc. Here’s a portion of the report that was created for this blog post, and it even creates a shareable link of the report.
Since Article Optimizer does some heavy-duty text analysis, Zack went with AlchemyAPI to power the underlying technology for his app. He had used our API on a few past projects, so he knew it would be a great fit for Article Optimizer. In fact, Zack “feels that Alchemy is a great choice whenever you want to imbue your program with a higher level of intelligence regarding the data it is working with.”
One place where AlchemyAPI is especially helpful is finding related images. Zack uses our text categorization API to figure out the high level category of the content, and then uses that to find related images. On why this approach is better than just using keywords, Zack said, “I find the text categorization feature to be very accurate at determining what the entire article is about, so using this to search for images returns much better results than when I tried searching on the top keywords parsed from the text.”
For the future, Zack is working on more tools to help writers. On the horizon is a fully featured content development environment to make writing high quality content even easier. We’re looking forward to using that tool too!
Note-taking in school has largely remained the same for hundreds of years: you listen to the lecture and write down the important stuff. Whether that's on paper or on your laptop, it’s pretty much the same. Enter "Know Your Notes," a hack created over the weekend at the recent hackMIT hackathon. It’s a note-taking application that uses AlchemyAPI to extract semantic information from your notes. It reads your notes, identifies the concepts, entities, and keywords, and links to related Wikipedia articles where you can deepen your understanding of the topics. Here’s a screenshot of the application:
This is a great example of how you can use the power of natural language processing to get a more comprehensive understanding of topics you’re interested in. Additionally, Know Your Notes has another great trick up its sleeve. It will read your notes, and use the relations extraction API to pull out the Subject -> Action -> Object relations. From these relations, it automatically generates flash cards where it hides one of the parts, as shown below.
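That flash-card step is easy to sketch: given a Subject -> Action -> Object relation (the shape relation extraction returns), hide each part in turn to make fill-in-the-blank cards. The code below is our own illustration with a hard-coded example relation, not the hack's actual implementation.

```python
def make_cards(relation):
    """Generate one cloze card per hidden part of the relation."""
    cards = []
    for hidden in ("subject", "action", "object"):
        parts = dict(relation)          # copy so we can blank one part
        answer = parts[hidden]
        parts[hidden] = "____"
        prompt = f"{parts['subject']} {parts['action']} {parts['object']}"
        cards.append({"prompt": prompt, "answer": answer})
    return cards

# Example relation in Subject -> Action -> Object form.
relation = {"subject": "Mitochondria", "action": "produce", "object": "ATP"}
for card in make_cards(relation):
    print(card["prompt"], "->", card["answer"])
```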
This hack was an excellent example of how to utilize AlchemyAPI to enhance your content. Nice work to the team that created this application!