Dec 17th, 2007 by dfhuynh
Our project offers a diverse toolkit of more than a dozen tools, each at a different level of maturity. Consequently, it can be hard for people outside our team to see how all of these pieces fit together into a coherent, compelling story. Once in a while, we need to step back from the code editor and turn to the blog to put down in words where we are really heading with all that code…
Here is my attempt at doing that: I have written up a wiki page documenting how we ourselves have used several of our tools to automate the scraping of the official MIT course catalog web site and to provide better browsing features on the data it contains:
(Click the images to see the sites.)
The tools used include Solvent, Crowbar, Juggler, Exhibit, and Timegrid. (Juggler and Timegrid have not yet been officially released; use them at your own risk.) See this wiki page for more details.
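To give a concrete sense of the publishing end of that pipeline, here is a minimal sketch of an Exhibit page over scraped catalog data. The data file name (`courses.json`) and the `department` property are hypothetical stand-ins, not the actual catalog setup; see the wiki page for the real configuration.

```html
<html>
  <head>
    <title>MIT Course Catalog (sketch)</title>
    <!-- hypothetical JSON data file produced by the scraping steps -->
    <link href="courses.json" type="application/json" rel="exhibit/data" />
    <!-- Exhibit API, loaded from the SIMILE servers -->
    <script src="http://static.simile.mit.edu/exhibit/api-2.0/exhibit-api.js"></script>
  </head>
  <body>
    <table width="100%">
      <tr valign="top">
        <td width="20%">
          <!-- faceted browsing on a hypothetical "department" property -->
          <div ex:role="facet" ex:expression=".department" ex:facetLabel="Department"></div>
        </td>
        <td>
          <!-- default tile view over all the course items -->
          <div ex:role="view" ex:viewClass="Tile"></div>
        </td>
      </tr>
    </table>
  </body>
</html>
```

The point of the declarative `ex:role` attributes is that no custom JavaScript is needed: once the scraped data is expressed as Exhibit JSON, facets and views come for free.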
We are continually making our tools easier to use, but hopefully they are already useful and usable for many of our target users right now. If you have similar scenarios using our tools, please share them with us! Thanks!