Using APIs for Charity, Part 2

John Musser, October 8th, 2007

In this second and final installment of our mashup case study on Giveness, we pick up from Part 1 and our conversation with founder Richard Waldvogel, digging into more detail on REST vs. SOAP, caching, and other mashup lessons learned.

Q: Do you find the Amazon or the eBay API easier to work with?

A: Getting our eBay API account set up required more attention than with Amazon. You have to store a DevId, AppId, CertId and tokens for REST and SOAP to authenticate your requests, and a unique set of these is required for Sandbox and Production communication. Once you save these values to a config file, you’re done, but it’s a little more work up front by comparison.
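To give a sense of what that looks like in practice, here is a minimal sketch of keeping those values in a web.config appSettings section and reading them in C#. The key names and helper class are hypothetical, not Giveness’s actual setup.

    // Hypothetical web.config entries (one set per environment):
    //   <add key="Ebay.Sandbox.DevId"  value="..." />
    //   <add key="Ebay.Sandbox.AppId"  value="..." />
    //   <add key="Ebay.Sandbox.CertId" value="..." />
    //   plus a matching Ebay.Production.* set.
    using System.Configuration;

    public static class EbayCredentials
    {
        // Pulls one credential value, switching between Sandbox and Production.
        public static string Get(string name, bool useSandbox)
        {
            string env = useSandbox ? "Sandbox" : "Production";
            return ConfigurationManager.AppSettings["Ebay." + env + "." + name];
        }
    }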

With each new release, the APIs continually get more robust and easier to work with and that’s encouraging. I believe the API providers like Amazon and eBay take the feedback they receive from the developers very seriously and do their best to keep us happy. I’ve been in a few Amazon chats where I’ve seen posted suggestions get implemented in a matter of days. If you have a complaint about an API, don’t hesitate to suggest a way to improve it.

Q: You said SOAP is easier, but REST saved you from having to use big WSDLs. Can you elaborate?

A: In Visual Studio, it’s very simple to add a web reference and get down to business. You can use the object viewer to study the objects that are available to you. IntelliSense comes in quite handy when you’re initially working with an object and don’t have all the properties memorized yet.

The eBay .wsdl is pretty big and has a lot of functionality in it. We currently use only a small set of their available calls, so we decided to use REST for them. As our eBay interaction grows, there may come a time when incorporating SOAP is a better choice for us.

Versioning with our REST calls seems a lot less troublesome than with a .wsdl file. We store the version of the API we’re using in our web.config file. When we decide to upgrade to a new release, we simply change the version number in the config. If there is an issue with the new release, we can roll back just as easily.
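As a rough illustration of that setup (the key name and endpoint below are placeholders, not the real Giveness configuration):

    // web.config: <add key="EbayApiVersion" value="525" />
    using System.Configuration;

    public static class EbayRequest
    {
        // Builds a REST call URL; rolling the API version forward or back
        // is a one-line change in the config file.
        public static string BuildUrl(string callName)
        {
            string version = ConfigurationManager.AppSettings["EbayApiVersion"];
            return "http://rest.example.com/api?CallName=" + callName +
                   "&Version=" + version; // remaining request parameters omitted
        }
    }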

Almost every API provider supports making REST calls to its service. Using REST standardizes a lot of our effort in communicating with all the APIs we use. It’s a good idea to experiment and use the solution that best suits your needs rather than getting caught up in the REST vs. SOAP debates. Try them both and see which one works best for you.
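To sketch what a shared REST helper might look like in .NET 2.0 (our illustration, not the Giveness code):

    using System.Net;
    using System.Xml;

    public static class RestClient
    {
        // One GET helper serves every REST API the site talks to;
        // callers just build the URL and parse the returned XML.
        public static XmlDocument Get(string url)
        {
            using (WebClient client = new WebClient())
            {
                XmlDocument doc = new XmlDocument();
                doc.LoadXml(client.DownloadString(url));
                return doc;
            }
        }
    }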

Q: In building this with .NET did you use any useful API wrappers? Any good tools or techniques that made using APIs easier?

A: We did not use any third-party wrappers for our application. We wrote our own custom wrappers using the Provider Design Pattern. This allows for a nice abstraction for communicating with the API: the data returned from the API is validated and then parsed into a custom business object.
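A minimal sketch of the shape such a wrapper can take (the class and member names here are hypothetical; the full .NET provider pattern also wires the concrete class up through configuration):

    using System.Collections.Generic;

    // Custom business object handed back to the rest of the application.
    public class Product
    {
        public string Title;
        public decimal Price;
        public string DetailUrl;
    }

    // Abstract provider: callers never see the raw API response.
    public abstract class ShoppingProvider
    {
        public abstract List<Product> Search(string keywords);
    }

    // One concrete provider per API, each validating and parsing the raw
    // response into Product objects.
    public class AmazonProvider : ShoppingProvider
    {
        public override List<Product> Search(string keywords)
        {
            List<Product> results = new List<Product>();
            // 1. Build the REST request URL for this API.
            // 2. Fetch and validate the response.
            // 3. Parse the returned XML into Product objects.
            return results;
        }
    }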

When working with a new API, we have a few custom desktop tools that we built to evaluate and test different feeds before integrating them into our web applications. We can easily adjust our requests to the API and manipulate them to really see what they are returning or accepting. We use a lot of try/catch blocks in our initial code to find any instances where the API communication could be an issue. Once they’re discovered, we write proactively to prepare for possible bottlenecks and rely less on the try/catch.
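Early exploratory code in that style might look something like this (a hypothetical example, not theirs):

    using System;
    using System.Net;
    using System.Xml;

    public class FeedTester
    {
        // Exploratory style: catch broadly and log what happened, so the
        // failure modes of an API are known before explicit guards are written.
        public XmlDocument TryFetch(string url)
        {
            try
            {
                using (WebClient client = new WebClient())
                {
                    XmlDocument doc = new XmlDocument();
                    doc.LoadXml(client.DownloadString(url));
                    return doc;
                }
            }
            catch (Exception ex)
            {
                Console.WriteLine("API call failed: " + url + " - " + ex.Message);
                return null;
            }
        }
    }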

Q: Most shopping mashups cache data, but different APIs have different caching rules. How do you handle that?

A: We have implemented several different caching mechanisms for storing the data we get from our APIs. Depending on the data we are requesting, we may store it in the database, on the file system, or in memory.

Using the Cache object in .NET makes implementing caching rules very easy. We can simply set the time span that the object can exist in the cache before it is dumped and refreshed from the API.
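For example (the key and lifetime here are illustrative; the real expiry follows each API’s caching rules):

    using System;
    using System.Web;
    using System.Web.Caching;

    public static class ItemCache
    {
        // Stores an API response; it silently drops out of the cache once the
        // allowed lifetime passes, at which point we refresh from the API.
        public static void Store(string key, string responseXml, TimeSpan allowedLifetime)
        {
            HttpRuntime.Cache.Insert(key, responseXml, null,
                DateTime.Now.Add(allowedLifetime), Cache.NoSlidingExpiration);
        }

        // Returns the cached copy, or null if it has expired or was never stored.
        public static string Lookup(string key)
        {
            return HttpRuntime.Cache[key] as string;
        }
    }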

On the database and file system side, we use the time stamp of the record or file. When the data is requested, we examine the time stamp and determine whether the data has expired according to the rules of the API. If it has, we dump it, request a fresh copy and update the appropriate data cache.
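A file-system version of that check might look like this (the paths and age limits are ours, for illustration):

    using System;
    using System.IO;

    public static class FileCache
    {
        // Returns the cached data if its timestamp is still within the API's
        // allowed age; otherwise the caller fetches a fresh copy and re-saves it.
        public static string Read(string path, TimeSpan maxAge)
        {
            if (File.Exists(path) &&
                DateTime.UtcNow - File.GetLastWriteTimeUtc(path) < maxAge)
            {
                return File.ReadAllText(path);
            }
            return null; // expired or missing
        }

        public static void Save(string path, string data)
        {
            File.WriteAllText(path, data); // writing resets the timestamp
        }
    }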

Q: Do you monitor the APIs in terms of reliability? Can you recount any problems or anecdotes about issues with the APIs or services?

A: We do a significant amount of logging in our API providers. We don’t specifically monitor the APIs through a pinging service, but we know when a service isn’t responding as we expect. If we notice a problem, we’ll head over to the API’s developer forums to see if there is a bigger issue we need to be aware of.

Honestly, we’ve never experienced any major outages with the APIs we work with. Periodically a few requests will throw an error here and there. This goes back to using a service you’re comfortable working with; if we used some lesser-known APIs, we might experience more problems, and a monitoring system would be handy.

S3 experienced a few hiccups when it first launched and everyone started rushing to incorporate it into their applications. Luckily for us, we weren’t launched yet, and Amazon quickly sorted out the major issues that developers were experiencing.

Q: You mentioned you used Google Video and YouTube: do you use their APIs?

A: Yes, we do. If we see a video embed tag being saved, we’ll find the unique video ID, make a call to the proper API, and cache a copy of the video details locally. As more and more video providers come online, it’s going to be a challenge to incorporate all these different APIs. The video space is pretty chaotic these days.
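As a rough illustration, pulling a YouTube ID out of an old-style embed tag might look like this (the regex is our assumption, not their production code):

    using System.Text.RegularExpressions;

    public static class VideoEmbedParser
    {
        // Old embed tags point at youtube.com/v/<id>; grab the id so the video
        // details can be requested from the API and cached locally.
        public static string ExtractYouTubeId(string embedHtml)
        {
            Match m = Regex.Match(embedHtml, @"youtube\.com/v/([A-Za-z0-9_\-]+)");
            return m.Success ? m.Groups[1].Value : null;
        }
    }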

Thanks again to Richard Waldvogel for sharing his experiences with us. You can learn more about their service at Giveness.com.


2 Responses to “Using APIs for Charity, Part 2”

Comment by Richard, October 9th, 2007 at 2:40 am

Thanks for the great questions and for taking the time to talk with us. I really enjoyed our discussions and hope to read more Case Studies on Programmableweb.com in the near future.

Comment by John Musser, October 11th, 2007 at 11:24 am

Hi Richard – it was great speaking with you and thanks again. Best wishes for continued success with Giveness!
