

Product Roundup: A Complete List of Q4 2014 Updates


At AlchemyAPI, we iterate often and release new features on a regular basis. We’re constantly taking feedback, testing and striving to improve the ease with which you can effectively gather, process and analyze your unstructured data.

Below are several updates that we made at the end of 2014 that you may have missed during the holiday hustle and bustle. As with all of our updates, these are available to you now via our REST APIs.

Sentiment Analysis API - 3 New Languages Added

Learning How Global Customers Feel Just Got Easier


You spoke, we listened. Our users tell us that understanding text written in multiple languages is important. After all, you can’t afford to misunderstand the intent and emotions behind the online conversations of your customers.

That’s why we are excited to announce the addition of new languages to our Sentiment Analysis API. You can now process your text in English, Spanish and German as well as 3 new languages -- French, Italian and Portuguese -- with more coming soon.

With these additions, your sentiment analysis reach expands to more than 1.4 billion native speakers throughout the world and countless others who speak them as secondary languages.

Visual Similarity Search

Discover the Intelligence Hidden Within Your Image Database

This new AlchemyVision feature provides access to our advanced visual similarity search capabilities and enables companies to select images and retrieve similar images based on visual attributes (such as background, foreground, color, texture, composition, etc.).


Visual similarity search helps companies improve advertising performance, make highly customized eCommerce recommendations or accurately tag expansive image libraries.

Recently, our engineering team partnered with several customers to perform custom image analysis experiments on their massive libraries. Because these customized services require close, hands-on engagement with our engineers, the capabilities are only available after a project discovery. Try our demo and contact us to learn more.

Authors Extraction

Categorize Content from Multiple Authors

Currently, there is no standard way to identify or embed author information for a given piece of content, which makes reliably extracting authors a complex task.

Previously, you could determine the primary author of each piece of content using the Author Extraction API (now deprecated). With our recent enhancement, you can now use the Authors Extraction API to identify multiple authors of online publications and content and categorize them appropriately.

By combining authors extraction with other text analysis functions, you can generate a list of keywords for SEO, identify reader sentiment or determine the concepts common to each individual author, and integrate this intelligence into your application.
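
If you want to experiment, here is a minimal sketch of calling the Authors Extraction API over REST. It assumes the URLGetAuthors call name and the response layout shown in the comments, plus your own API key; treat it as a starting point rather than a definitive reference.

```python
import requests

API_KEY = "YOUR_ALCHEMYAPI_KEY"  # replace with your own key


def get_authors(url):
    """Ask the Authors Extraction endpoint for every author found on a page."""
    resp = requests.get(
        "http://access.alchemyapi.com/calls/url/URLGetAuthors",
        params={"apikey": API_KEY, "url": url, "outputMode": "json"},
    )
    resp.raise_for_status()
    data = resp.json()
    # Response layout assumed: {"authors": {"names": ["Jane Doe", "John Doe"]}}.
    return data.get("authors", {}).get("names", [])


if __name__ == "__main__":
    for name in get_authors("http://www.example.com/some-article"):
        print(name)
```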

Language Detection

Behind-the-Scenes Enhancements

Detecting the language of text, HTML or web-based content is a foundational element of any text analysis endeavor. To improve your text analysis results, our core API team spent the last part of 2014 enhancing our Language Detection API.

In general, language detection is now more precise and faster than ever before. In particular, detection has gotten substantially better for Asian languages such as Chinese, Korean and Japanese, among others.
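
As a quick illustration, here is a hedged sketch of hitting the Language Detection API on raw text. It assumes the TextGetLanguage call name and the response fields shown in the comments; check the documentation for the exact fields available on your plan.

```python
import requests

API_KEY = "YOUR_ALCHEMYAPI_KEY"  # replace with your own key


def detect_language(text):
    """Return the detected language name and ISO 639-1 code for raw text."""
    resp = requests.get(
        "http://access.alchemyapi.com/calls/text/TextGetLanguage",
        params={"apikey": API_KEY, "text": text, "outputMode": "json"},
    )
    resp.raise_for_status()
    data = resp.json()
    # Field names assumed from typical responses; consult the docs to confirm.
    return data.get("language"), data.get("iso-639-1")


if __name__ == "__main__":
    print(detect_language("Ceci est un test"))   # expect French
    print(detect_language("これはテストです"))     # expect Japanese
```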


Try it

As with all of our web services, if you are using AlchemyAPI then you automatically have access to these updates.

If you are just getting started, we invite you to try our APIs on your own data with a complimentary API account. You receive 1,000 transactions per day and quick-response email support, and you can upgrade at any time by contacting our sales team.




Understand Consumer Sentiment in 3 New Languages

3 New Languages Added to Sentiment Analysis Tools

Learning How Customers Feel Just Got Easier


Based on Google Trends data, searches for the term “sentiment analysis” have risen nearly 40% over the past 12 months. If that’s not an indication of the growing need to understand and predict consumer behavior, we don’t know what is.

Thanks to the viral nature of the internet, your brand has a global presence and, most likely, patrons who reside in many different countries. How they speak about you and the intent they express toward your brand matter, no matter what language the conversation takes place in.

That’s why we are excited to announce the addition of new languages to our Sentiment Analysis API. You can now process your text in English, Spanish and German as well as 3 new languages -- French, Italian and Portuguese -- with more coming soon.

With these additions, your sentiment analysis reach expands to more than 1.4 billion native speakers throughout the world and countless others who speak them as secondary languages.

With more than two-thirds of purchase decisions happening before the buyer makes contact with your company, you can't afford to misunderstand, or miss altogether, the conversations happening online. Use a smart data approach to analyze consumer content in different languages and provide personalized customer experiences to give your business a competitive edge.
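
For the curious, here is a minimal sketch of running document-level sentiment analysis on text in the newly supported languages via the REST API. It assumes the TextGetTextSentiment call and the docSentiment response fields shown in the comments; swap in your own API key.

```python
import requests

API_KEY = "YOUR_ALCHEMYAPI_KEY"  # replace with your own key


def sentiment(text):
    """Document-level sentiment for text in any supported language."""
    resp = requests.get(
        "http://access.alchemyapi.com/calls/text/TextGetTextSentiment",
        params={"apikey": API_KEY, "text": text, "outputMode": "json"},
    )
    resp.raise_for_status()
    doc = resp.json().get("docSentiment", {})
    # "type" is positive / negative / neutral; "score" is typically omitted for neutral text.
    return doc.get("type"), doc.get("score")


if __name__ == "__main__":
    samples = [
        "J'adore ce produit, il est fantastique !",      # French
        "Questo servizio è terribile.",                  # Italian
        "O atendimento foi excelente, muito obrigado.",  # Portuguese
    ]
    for s in samples:
        print(s, "->", sentiment(s))
```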

Learn more about our Sentiment Analysis API. Then, register for an API Key to get started.




The Truth About Natural Language Processing: Myth #3

How to Cash In On NLP Without Cashing Out: 4 Tips to Avoid NLP Customization

Which NLP path will you choose?

Thus far in our “Dispelling Common NLP Myths” series, we’ve revealed the facts about two common natural language processing (NLP) myths. In the first post, we share details surrounding the belief that human labeling of data is the only way to teach and train complex algorithms. The second post discusses the benefits and limitations of freemium NLP systems. In this post, learn how to determine whether you need extensive NLP customization to get results for your business.

According to a report published on PR Newswire*, the NLP market is expected to grow from roughly $3.8 billion in 2013 to $9.9 billion by 2018. Much of this growth can be attributed to businesses looking for ways to provide highly personalized customer experiences by gaining a clear understanding of preferences and intentions through data collected from online and offline interactions.

As NLP systems have become mainstream, it has gotten much easier for businesses to extract, transform and store data for analysis. With this information, leaders are armed to answer tough questions and make more informed decisions for their business.

With accurate answers to such questions, NLP systems enable businesses to build applications that determine customers’ preferences, interpret brand reputation and tailor messaging across channels.

While many data-minded leaders are aware of the benefits that come from real-time NLP, there is a common misconception that these systems need to be customized for each particular use case. Every business wants a customized NLP solution; however, given the common constraints of budget, time and resources, building a system from the ground up is not always a good option. Fortunately, there are other ways to get the real story from data without exhausting your resources.

MYTH #3: NLP systems are too expensive to customize for my specific industry and use case.

If your company values velocity and seeks to move on results quickly, you may not have the time to build your own NLP system from scratch. What other options are available to achieve customized results?

“NLP customization is an area of active focus both in research and in business, with brand new technology being applied to the [customization] problem every day,” shares Aaron Chavez, VP of Engineering at AlchemyAPI. Chavez offers some tips for those who are looking to meet their data analysis goals without breaking the bank on fully customized solutions.

TIP #1: Establish metrics.
First and foremost, determine how your success will be measured by developing a goal. Your goal should be S.M.A.R.T. - specific, measurable, attainable, realistic and timely. It is important to determine who will be involved, what you want to accomplish, a time frame to complete the project and a clear purpose. (See our guide for creating SMART goals below). With the end goal in mind, your efforts will be focused and effective.

Guide for creating SMART goals:

  • Who: Who is involved?
  • What: What do you want to accomplish?
  • Where: Identify a location.
  • When: Establish a time frame.
  • Which: Identify requirements and constraints.
  • Why: Specific reasons, purpose or benefits of accomplishing the goal.

TIP #2: The more data, the better.
Natural language processing with unsupervised algorithms enables applications to ‘get smarter’ on their own. As an app is presented with more data, the system will autonomously enhance its data analysis capability without hands-on help from employees. This frees up your resources to work on other important tasks.

TIP #3: Don’t be afraid to experiment.
NLP systems have tremendous capabilities. Before you resort to building a customized solution and hiring a team of data scientists, fully test what's already out there. Explore the resources provided by your NLP provider or contact your account representative for guidance on getting started. To truly challenge the system to the extent of its capabilities, be sure to give yourself enough time.

TIP #4: Lean on your community. You’re not alone.
Chances are, someone has struggled with a similar problem in their business and there are ways to start a conversation.

  • Connect with others and get peer feedback through forums like Quora or Stack Overflow.
  • Use your NLP partner as a resource. Have them connect you with a fellow customer. Brainstorm with that customer to learn about the steps they took and how they overcame challenges. What resources did they use? How successful were their attempts? What did they learn from their experience that you want to emulate (or not)?
  • Take advantage of the experts in the field by using what they’ve already built. Add the qualities and features your business requires and save yourself the hassle of building the infrastructure, team and algorithms that are the foundation of a good NLP solution.

We know that these decisions are not made easily. Take the time to explore your options with your primary purpose in mind. Is your goal to build an NLP solution from the ground up? Or, is it to create applications that solve an industry-related data problem? Either way, we recommend using existing models as a foundation. Do a proof-of-concept and experiment with potential solutions and then inspect and adapt from there.

Need help? Contact us directly. Our NLP experts are ready to discuss your specific challenges, offer advice, connect you with similar business leaders or give you a list of questions to ask potential NLP providers.

Stay tuned for our fourth and final post in the series where we will debunk the myth: “I can teach NLP algorithms by simply aiming them at the web.”





*View the full PR Newswire report here.


The Future of Deep Learning

5 Predictions and Resolutions for Smarter Data Analysis in 2015

It's 2015. If you haven’t already, it’s time to create a plan for getting the most from your data so that you can understand your customers and how your company’s efforts impact the bottom line.

As more and more companies realize the value hidden within their massive text and image databases, we see increasing interest in and usage of emerging techniques that were once reserved for big-budget organizations and academic institutions. Technologies such as natural language processing and deep learning are now being used by companies of all sizes to gather and process data through smarter applications.

Whether you have a big data project already underway or you’re just starting to determine your big data resolutions, there are ways to implement useful technologies and insightful strategies in order to make your data work for you in the New Year.

To spark your thought process, we worked with several industry experts to gather their 2015 big data predictions and come up with corresponding resolutions that can help you make a difference this year.

  1. Prediction: Data collection and storage will be just as important as analysis.
    – Aaron Chavez, VP, Engineering at AlchemyAPI

    Resolution: We will treat data warehousing and data analysis as separate tasks. We'll identify our most important data sources and create templates for gathering and storing different data streams for smarter analysis.

    Roll up your sleeves and put some work into your data workflow. Most businesses are able to identify lots of data sources that can prove valuable, but an asset isn’t an asset unless you can find it.

    With this goal, plan to discover your data sources (hint: start with the most important two or three and build from there), document how you gather data from these sources and determine a methodical, repeatable plan for storing data for analysis. And remember to figure out how and where you’re going to store the resulting data (metadata such as geotags, author or published date).

  2. Prediction: Smart machines will fulfill the promise of big data in 2015.
    – John Rauscher, CEO at Yseop Artificial Intelligence

    Resolution: We will gain a strategic business advantage by using smart machines to accurately and efficiently understand the voice of our customer.

    In 2015, smart machines will revolutionize white-collar work. We will see a divide between early adopters who have gained a competitive and strategic business advantage using smart machines and those who have missed out on what Gartner calls “the most significant technology shift of this decade”.

    Smart machines will revolutionize the service sector in three ways: (1) automating manual writing (reports, emails, etc.); (2) enhancing employee expertise by making a company’s sales or marketing data “speak” in natural language; and (3) improving your customer-brand relationship through personalized marketing campaigns—all to help businesses finally realize the promise of big data.

    Put this into action by making a commitment to, at the very least, explore your options. Learn about the web services available, read examples from your peers already using them and give yourself some creative wiggle room with an internal hackathon or proof-of-concept.

  3. Prediction: Big data will become even bigger, but less overwhelming.
    – Elliot Turner, CEO at AlchemyAPI

    Resolution: We will be “data agnostic” and capture as much data as we can to gain a clear understanding of our business and industry.

    In the data world, the “less is more” adage simply doesn’t hold true. Businesses need to avoid sampling bias by capturing every data point found across multiple channels. Every product image, chat stream, support ticket and social media post matters. Separately, individual data sources can tell you a small amount about your customer. Together, they provide crucial insight into customer behavior and add valuable intelligence that can guide strategic business decisions.

    Take a first step towards this goal by performing a discovery process of all of your data sources. The reality is this -- you can’t pull insights from data you don’t have. Don’t discount a data channel because it is difficult to understand. As mentioned previously in this post, there are tools to help with that!

    Get the full picture by exporting as much data as you can with tools that your company already uses like Zendesk, Salesforce or Moz. Or, take a look at some data export APIs like Heap Analytics. Then, experiment with low-risk solutions that allow you to perform a quick proof-of-concept with your own data and minimize time spent on data wrangling. Natural language processing and computer vision web services are available to help you gather and analyze data quickly and avoid making costly assumptions that are not based on the reality of your market.

  4. Prediction: Companies will leverage machine learning to globally scale their efforts.
    – Zeph Snapp, Founder & CEO at Altura Interactive

    Resolution: We will make our “data world” smaller by analyzing data in different languages and internationalizing efforts to understand our customers.

    With the digital connectedness of everything, businesses can easily connect with users around the globe. While English is a broad-reaching language, the fact is that speaking to consumers in their native language and providing unique experiences based on location will always be a competitive advantage.

    This prediction is already becoming a reality with new deep learning tools. For example, Skype Translator makes sales calls easier by automatically translating one-to-one conversations from English to Spanish. And sentiment analysis web services help businesses understand consumer emotions regardless of the language they are written in. As these technological advancements become mainstream, use them to bridge gaps in your process to optimize your customer’s journey.

  5. Prediction: Data-oriented analytical processes will move from being reactive to proactive and predictive.
    – Devin Harper, Data Scientist at AlchemyAPI

    Resolution: We will drive market change by using the data we collect to focus on what will happen instead of what has already happened.

    Instead of focusing on what has already happened, make 2015 the year of “predictive data”. Forecast purchasing patterns, predict customer churn, mitigate risk and more. Get started with deep learning services and bypass the heavy lifting that goes into building a system of your own (learn more about building a system with this slideshare). We encourage you to save time, money and resources by utilizing readily available data analysis technologies to get ahead of problems before they occur.

    Here are 3 examples to help you find your inspiration:

    • Public relations giant, Waggener Edstrom, uses five natural language processing functions to power WE Infinity, a data mining and analytics platform that identifies trends in real-time to help forecast consumer behavior and drive intelligent action while a campaign is still in flight.
    • Sales intelligence platform, Spiderbook, uses deep learning web services to collect and analyze business information related to partnerships, acquisitions, products, branding, SEC listings, staff profiles and more. With this information, it forecasts (with 10x more accuracy than traditional systems) who will partner with whom and which companies are most likely to buy from a particular business.
    • Advertising network, AdTheorent, enriches its predictive modeling with deep learning to match web page content to reader interest with hyper-relevant ad targeting, which goes far beyond simply categorizing a web page or a tweet. Their efforts have increased click-through rates (CTRs) on their ads by more than 200% and have enabled more effective monetization for their clients.

Our final prediction? In 2015, we will see people harness the data that’s available to them with a renewed sense of innovation. We challenge you to create at least one resolution or goal that allows you to explore what is possible when you have a deeper understanding of your data.

Truth is, what we’ve seen of deep learning is just the tip of the iceberg (thanks for that gem, Derrick Harris!) and we are excited to see what you do with it.

Let us know how we can help. Whether your question is “how do I get to the heart of my social media analytics” or “how can I recommend content that my readers want” or “what are our top support issues based on customer feedback”, let’s brainstorm to find an answer that works for you.




Empowering Next Gen Coders

'Tis The Season to Give Back.
Thanks for supporting Codestarter.


All programmers get their start from someone willing to teach them a thing or two about coding. Now, you have the chance to ensure that an aspiring coder has their own “Hello, World” experience. And it all begins with getting a computer in their hands.

Last week, Alchemists around the world selected Codestarter as the organization to receive a donation from AlchemyAPI this year. Thanks to all who voted. We wanted to share a bit about what they do… it’s pretty cool.

Codestarter seeks to ensure that every child has the opportunity to learn how to code no matter their background, gender, race or economic status. For every $250 donated, they ship a new laptop to a deserving child waiting to enroll in a programming class.

Donations go directly to helping children like Joey get a laptop and begin their coding journey.


Check out Codestarter’s mission and consider making a donation to support the next generation of coders in our community.


Analyzing Holiday Sentiment

The "Grinchiest" States in America According to Twitter

He’s a mean one, Mr. Grinch. And according to recent Tweets, he’s probably hiding in Hawaii or South Dakota. Most likely, he won’t come in the form of a green, Christmas-loathing monster, but there’s no doubt that you’ll recognize him (or her) when you see his digital musings.

Devin Harper, Data Scientist at AlchemyAPI, shares how he analyzed Twitter hashtags to find the “grinchiest” states this season and posts his results in an interactive map that shows sentiment gathered from across the U.S. The good news is that most of us welcome the holidays. However, if you are traveling to a “grinchy” state, might we suggest that you arm yourself with hot cocoa (or a surfboard) and loads of holiday cheer?!

To find America’s "grinchiest" states, we took a look at Tweets from across the nation over the last 30 days. They were marked with holiday hashtags such as #christmas, #xmas, #holidays, and #hannukah. Devin followed this workflow to understand the Tweets:

  1. Pull Tweets for the last 30 days using Twitter’s Streaming APIs
  2. Process using Sentiment Analysis API
  3. Gather results into Alchemy’s local datastore
  4. Compute a sentiment index for each state that reflects the overall fraction of Tweets that are positive or negative
  5. Visualize using Google’s Geochart API
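
For readers who want to try a scaled-down version of steps 2 through 4, here is a rough Python sketch. It is not Devin's actual code: step 1 is stubbed out with a hypothetical tweets_by_state dictionary, the sentiment index is assumed to be the share of positive Tweets minus the share of negative ones, and the Geochart visualization (step 5) is omitted.

```python
from collections import Counter

import requests

API_KEY = "YOUR_ALCHEMYAPI_KEY"  # replace with your own key


def tweet_sentiment(text):
    """Classify one Tweet as positive, negative or neutral."""
    resp = requests.get(
        "http://access.alchemyapi.com/calls/text/TextGetTextSentiment",
        params={"apikey": API_KEY, "text": text, "outputMode": "json"},
    )
    resp.raise_for_status()
    return resp.json().get("docSentiment", {}).get("type", "neutral")


def sentiment_index(tweets):
    """Assumed index: share of positive Tweets minus share of negative ones, in [-1, 1]."""
    counts = Counter(tweet_sentiment(t) for t in tweets)
    total = sum(counts.values()) or 1
    return (counts["positive"] - counts["negative"]) / float(total)


if __name__ == "__main__":
    # Placeholder for step 1: in practice these would come from Twitter's Streaming
    # APIs, filtered on #christmas, #xmas, #holidays and #hannukah, bucketed by state.
    tweets_by_state = {
        "CO": ["Loving the snow this #christmas!"],
        "HI": ["Another crowded beach. #holidays"],
    }
    for state, tweets in sorted(tweets_by_state.items()):
        print(state, round(sentiment_index(tweets), 2))
```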

While you may not be able to avoid a "grinchy" state, we hope you can spot a Grinch and help them find the joy in this holiday season. Happy Holidays from the team at AlchemyAPI!


Hackathons With ChallengePost

Hackers Choose Their Own API Adventure with ChallengePost

You want to hone your technical skills by participating in hackathons, but you don’t necessarily want to pull an all-nighter or travel to compete. Have you heard of ChallengePost? It's an online community that brings hackers together to team up on projects and inspire one another.

We’re amazed by the creative apps being built by programmers using API mashups. And, we're humbled that Alchemists around the world have shared more than 55 projects with the ChallengePost community. From helping publishers enhance multimedia content with product recommendations (YFly) to developing an app that quickly skims an entire article to pull related information from Wikipedia (Skimmer), these smart applications use natural language processing and computer vision in entirely new ways.

Here are a few examples from the Alchemy board to awaken your inner hacker.

Jobify - Helping Job Seekers Land The Perfect Gig

When looking for a new job, it can be pretty difficult to get noticed. The Jobify application increases a job seeker's chances of landing the elusive 'dream job' by providing suggestions to optimize their LinkedIn profile based on the desired job's description. Using AlchemyAPI, the app compares the description to an applicant’s profile by extracting relevant keywords from the job description, providing the applicant with an overall profile fit score and recommending keywords to add.

Jobify uses keyword extraction to help applicants optimize their profile
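
As a rough illustration of the matching idea (not Jobify's actual code), the sketch below pulls keywords from a job description and a profile with the Keyword Extraction API and scores their overlap. The TextGetRankedKeywords call is assumed, and the fit-score formula is our own simplification.

```python
import requests

API_KEY = "YOUR_ALCHEMYAPI_KEY"  # replace with your own key


def extract_keywords(text):
    """Pull ranked keywords out of free text with the Keyword Extraction API."""
    resp = requests.get(
        "http://access.alchemyapi.com/calls/text/TextGetRankedKeywords",
        params={"apikey": API_KEY, "text": text, "outputMode": "json"},
    )
    resp.raise_for_status()
    return {kw["text"].lower() for kw in resp.json().get("keywords", [])}


def profile_fit(job_description, profile_text):
    """Toy fit score: share of job-description keywords already on the profile,
    plus the keywords the applicant might consider adding."""
    job_kw = extract_keywords(job_description)
    profile_kw = extract_keywords(profile_text)
    score = len(job_kw & profile_kw) / float(len(job_kw) or 1)
    return score, sorted(job_kw - profile_kw)


if __name__ == "__main__":
    with open("job.txt") as j, open("profile.txt") as p:
        score, missing = profile_fit(j.read(), p.read())
    print("fit score: %.0f%%" % (score * 100))
    print("keywords to consider adding:", ", ".join(missing))
```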

Civic Sentiment - Predicting Political Outcomes With Social Media Data

Ever wish you could predict the future? Civic Sentiment is an app that does just that by using social media sentiment analysis to forecast political outcomes with increased precision.

Young voters can greatly influence political outcomes with their votes, so it is clear why campaign managers want to predict their actions before all of the ballots are in. To do so, it is essential to engage voters in the channels they prefer, such as Twitter. With 60% of Twitter users between the ages of 18 and 35 (Pew Research), it is a goldmine of data that helps portray the emotions and intentions of young voters. By tracking hashtag sentiment on social media, Civic Sentiment can help predict how young voters feel about political candidates, issues and events without implementing time-consuming surveys and focus groups.


Unload - Providing Quick and Easy Access to Waste Disposal Locations

Imagine that you just completed the dreaded spring cleaning. Now, you can relax... Not so fast. What will you do with all of the junk you collected? The good news is that there are plenty of places willing to take your old stuff. The bad news is that many of them only accept certain items.

Finding the right junk graveyard can be time-consuming and frustrating, but now there's an app for that. Unload helps you ‘take the load off’ by using natural language processing, state-of-the-art mapping tools and other technologies to provide access to the nearest and most suitable waste management facility for users.

Unload uses NLP to provide quick and easy access to the correct waste disposal locations


It’s your turn! Visit ChallengePost to learn about upcoming hackathon challenges or to join a hack team. Then, register for a free API Key and fuel your innovation with natural language processing and computer vision.




New Guide: Deep Learning in the Real World

6 Practical Deep Learning Examples for Today's Data-Driven Business

In just 24 hours, the world’s 3 billion internet users performed 2.8 billion Google searches, viewed more than 5.5 billion YouTube videos and sent nearly 500 million Tweets (stats courtesy of Internet Live Stats). That’s not just ‘big data’, that’s massive data. And the vast majority of it is unstructured -- data (such as emails, chats, articles, etc.) intended for human consumption and not designed for computers to process.


Internet Live Stats on December 9, 2014 show the activity of the world's internet users.

Over the past several years, businesses dealing with tremendous amounts of data have shifted their focus. Time that was once dedicated to poring over charts, tables, and spreadsheets is now spent seeking intelligent ways to automate data analysis and connect the dots between what consumers are saying across all channels.

This shift in the unstructured data approach is happening because of the wide availability of technologies that are faster, more adaptable and more accurate. Services built on deep learning and artificial intelligence, like AlchemyAPI, AT&T Speech and others, are moving from research labs to enterprise organizations that want to increase their agility and better serve their customers.

Businesses of all sizes are already using deep learning to transform real-time data analysis. Big players like Netflix, Google News and Amazon employ deep learning to understand users’ activities and preferences to then recommend movies, articles and products that they might like. There are also success stories from companies like AdTheorent, a real-time bidding advertising platform that uses deep learning to power a predictive modeling engine that significantly improves click-through rates.

Recently, we partnered with Janet Wagner (@webcodepro), a data journalist, full stack developer and contributor on ProgrammableWeb, to gather six practical deep learning use cases that demonstrate how these technologies integrate into businesses of all shapes and sizes.

In this new ebook, we discuss:

  1. Voice Search/Voice-Activated Assistants
  2. Recommendation Engines
  3. Image Recognition
  4. Image Tagging/Image Search
  5. Advertising
  6. Pattern Recognition

If you want to create smarter applications that make sense of your data, download Deep Learning: 6 Real World Use Cases. You’ll get ideas and inspiration for solving your unstructured data challenges from businesses that have been in your shoes.




From Marketer to App Developer

A Marketer Walks a Mile in a Programmer’s Shoes

By Sonya Hansen, Marketing Director


Do you ever think about embracing your inner geek, channeling creative energy into the vision you have for the next killer app? One neophyte developer, Phil Han, has taken on that challenge.

Phil’s day job is in marketing for a technology company. When he was younger, he was interested in programming but didn’t pursue it. Ultimately, Phil decided to learn to code because he was inspired to build an application that visualizes the top news stories around the world and wanted to improve his ability to grasp the technologies that are present in an app developer’s life. The result is Headlyne.me and, in his words, the experience has “escalated exponentially” his appreciation for the work that app devs take on.

I found Phil on Medium, a web site where you can share “little stories that make your day better and manifestos that change the world.” In the spirit of full disclosure, one reason for writing about Phil’s post is that his app (and professional skills) development includes using AlchemyAPI. On top of that, I’m surrounded by developers and often wonder what it would be like to walk a few miles in their coding shoes. I contacted Phil to get his thoughts on coding, the value of marketers learning to code and the effort involved with bringing his first app to life. Here’s what we discussed:

What sparks my interest is the “birth of an app developer” story that you tell and the fact that you also work in marketing. What are your thoughts on strengthening the bond between marketers and IT/app development?

In tech, marketing sometimes gets sidelined. The process can be top-down, i.e. the features are developed and then use cases are made. Most of the job of a marketer is translating those features into tangible benefits for the end user and showing them how it can make their life easier or better. In other words, when I interact with technical folks, it’s my job to understand what a feature does, so the business decision maker can understand its benefits to their company. To build on this, I felt the necessity to learn coding, and the easiest way to do that is through expressing my ideas as an app.

When you first thought about building an app that would enable visualizing top news around the world, did you think of it as unstructured data analysis?

At first, this wasn’t obvious. I assumed that each article title had some sort of metadata associated with it that would carry an indication of location. But it didn’t. That’s when I discovered AlchemyAPI, which would automatically take entities and organize them for me, using the Keyword Extraction API. All I had to do was filter for location-related entities (e.g. countries, states, etc.), cherry-pick the most relevant ones and then compile those into a list that could be run through the Google GeoCoding API.
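
To make the pipeline Phil describes a bit more concrete, here is a small editorial sketch (not his Headlyne.me source). It assumes the TextGetRankedNamedEntities call, a hand-picked set of location-like entity types, and the public Google Geocoding endpoint; you would supply your own keys.

```python
import requests

ALCHEMY_KEY = "YOUR_ALCHEMYAPI_KEY"   # replace with your own keys
GOOGLE_KEY = "YOUR_GEOCODING_KEY"

# Entity types treated as locations -- an assumed, non-exhaustive list.
LOCATION_TYPES = {"City", "Country", "StateOrCounty", "Region"}


def location_entities(headline):
    """Extract entities from a headline and keep only the location-like ones."""
    resp = requests.get(
        "http://access.alchemyapi.com/calls/text/TextGetRankedNamedEntities",
        params={"apikey": ALCHEMY_KEY, "text": headline, "outputMode": "json"},
    )
    resp.raise_for_status()
    entities = resp.json().get("entities", [])
    return [e["text"] for e in entities if e.get("type") in LOCATION_TYPES]


def geocode(place):
    """Resolve a place name to coordinates with the Google Geocoding API."""
    resp = requests.get(
        "https://maps.googleapis.com/maps/api/geocode/json",
        params={"address": place, "key": GOOGLE_KEY},
    )
    resp.raise_for_status()
    results = resp.json().get("results", [])
    if not results:
        return None
    loc = results[0]["geometry"]["location"]
    return loc["lat"], loc["lng"]


if __name__ == "__main__":
    headline = "Protests spread from Paris to Lyon as strikes continue in France"
    for place in location_entities(headline):
        print(place, geocode(place))
```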

What are your tips for others who are just getting started with analyzing data?

Have a clear idea of what kind of data you want. For me, this was easy – I needed coordinates. But the problem I encountered with AlchemyAPI was that it would only provide coordinates if the entity was disambiguated. That meant I had to devise a workaround (i.e. feeding the most relevant location to the Google GeoCoding API) so I could render the markers on the globe.

When I think about my experience in terms of marketing, I’ve taken valuable lessons in error identification. For example, if something’s not working in my automated social programs, if I’m not getting the response rate I want, I try to identify the root cause – is it the content that’s not interesting? Am I using the wrong keywords or targeting incorrect people? On a more macro level, I associate more with the Kanban method where I try to keep up with multiple marketing objectives simultaneously and then attend to the more mission-critical collateral or program that needs to be completed.

What surprised you most about your project?

That I was able to get it to work at all, after seven weeks of intense coding and a phenomenal amount of help from a couple of my friends. Also, that I needed to learn so many frameworks and languages (Python, Flask, the AlchemyAPI platform, BeautifulSoup, WebGL Earth, Git, Heroku, and Jinja2 in particular) just to bring this project to fruition.

At one point in your post on Medium, you write: “After annoying their customer service representative numerous times…” How would you characterize your interaction with AlchemyAPI?

Overall, extremely helpful. They provided minimum viable code for the AlchemyAPI query and helped me learn the syntax to actually extract the information I needed. My support contact was able to identify a way of extracting article titles by querying for a specific metadata tag associated with each article title, which helped me get the first version running. (Props to Mr. Josh Holmgren!)

Any additional information you’d like to add?

Full files and source code are available at @mrbriskly. Also, as I wrote in Medium, “I’ve learned now how hackers/web devs work: find solutions to one problem, one step at a time, rather than try to plan out an entire project in advance. I don’t know how you guys pull it off, web devs. But my appreciation for your work has escalated exponentially.”

How much of a geek do you need to be to develop apps? I’m not sure I’ve found an answer to that question. But I do know, after communicating with Phil, that I’m in a great position should I decide to take on app development. After all, I have a great team of app developers immediately available to consult with, free access to our APIs and constant inspiration from the great work our users do.




How To Analyze Email With Natural Language Processing


An Example from SendGrid


As programmers, developers and hackers, we're all seeking solutions for our big data challenges. Whether our data comes in the form of emails, tweets, phone calls, blog posts, news articles or the elusive carrier pigeon, we want to understand what’s being said and implied – and we want to do it quickly.

There are lots of technologies to try and it can be time-consuming to spin up a new one “just to see” if it works. It’s always better if we can borrow a bit from our community and tweak it to fit our own needs. Sometimes we just need a little inspiration...

(Insert drumroll here) We’re excited to share a sample email analysis application using email-as-a-service provider SendGrid and AlchemyAPI. This example was created by Kunal Batra (@kunal732), a Developer Evangelist at SendGrid. Recently, he began a Code Challenge where he committed to learning 15 new technologies in 15 days. Yikes!

On Day 4 of his challenge, Kunal explores natural language processing and showcases how you can discover the really interesting parts of your incoming email. There are many ways a business can use this idea, ranging from automatically responding to messages (hello, customer support!) to categorizing and classifying them, and more.

In his example, Kunal uses Alchemy’s keyword extraction, concept tagging and entity extraction APIs. If you want to take his example a bit further, we recommend adding sentiment analysis to the mix.
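
If you want a starting point for that extra step, here is a minimal Flask sketch (not Kunal's code) that receives an inbound email and scores its tone. It assumes SendGrid's Inbound Parse webhook posting the plain-text body in a "text" form field and the TextGetTextSentiment call; the routing rule at the end is purely illustrative.

```python
from flask import Flask, request
import requests

API_KEY = "YOUR_ALCHEMYAPI_KEY"  # replace with your own key
app = Flask(__name__)


def sentiment(text):
    """Document-level sentiment for the body of an email."""
    resp = requests.post(
        "http://access.alchemyapi.com/calls/text/TextGetTextSentiment",
        data={"apikey": API_KEY, "text": text, "outputMode": "json"},
    )
    resp.raise_for_status()
    return resp.json().get("docSentiment", {}).get("type", "neutral")


@app.route("/parse", methods=["POST"])
def inbound_email():
    # SendGrid's Inbound Parse webhook posts the plain-text body in the "text" field.
    body = request.form.get("text", "")
    tone = sentiment(body)
    # Illustrative routing rule: surface angry messages to a human sooner.
    queue = "urgent-support" if tone == "negative" else "general"
    print("from=%s tone=%s -> %s" % (request.form.get("from"), tone, queue))
    return "", 200


if __name__ == "__main__":
    app.run(port=5000)
```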

To get started, you’ll need a SendGrid account and a free AlchemyAPI key. Check out Kunal’s full post to see the example in its entirety and get instructions on how to test his application yourself.

