Ok, so what I understand you saying is that Twitter only keeps 7 days
or 3200 results available per person? So if I want trending over time
(more than 7 days) I'm going to have to pull that data and then store
it in a DB?
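Here's roughly what I'm picturing in Python (just my sketch: the table schema and function names are my own invention, and I've stubbed out the actual search API call with hardcoded sample data):

```python
import sqlite3

# Hypothetical local store: a cron job would pull search results
# periodically and append them here, so history survives past
# Twitter's 7-day / 3200-result window.
conn = sqlite3.connect(":memory:")  # use a file path in practice
conn.execute("""
    CREATE TABLE IF NOT EXISTS tweets (
        id INTEGER PRIMARY KEY,   -- tweet id from the API
        created_at TEXT,
        query TEXT,               -- which search this row came from
        text TEXT
    )
""")

def store_results(query, results):
    """Insert search results, skipping ids we've already stored."""
    conn.executemany(
        "INSERT OR IGNORE INTO tweets (id, created_at, query, text) "
        "VALUES (?, ?, ?, ?)",
        [(r["id"], r["created_at"], query, r["text"]) for r in results],
    )
    conn.commit()

# Sample payload standing in for a real search API response:
sample = [
    {"id": 1, "created_at": "2009-01-07T16:52:00", "text": "first tweet"},
    {"id": 2, "created_at": "2009-01-08T09:10:00", "text": "second tweet"},
]
store_results("python", sample)
store_results("python", sample)  # re-running doesn't duplicate rows

count = conn.execute("SELECT COUNT(*) FROM tweets").fetchone()[0]
print(count)  # 2
```

The `INSERT OR IGNORE` means overlapping pulls (inevitable when you re-query the same window) don't create duplicate rows.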
Right now I am dabbling in Python as a way to retrieve, parse and
store the data.
On Jan 7, 4:52 pm, Peter Denton <petermden...@gmail.com> wrote:
> Hi Kidd
> Main reason to localize the data is for user experience.
> If twitter search slows down, you may have page loads waiting for the
> content you need. Also, you will only get 3200 results, or a historical
> snapshot of 7 days, from a query, so you run the risk of losing any data
> outside that window.
> It all depends on what data you need for how long.
> Now, if querying twitter search on the fly works well, you basically need to
> 1. retrieve the data from twitter search (probably JSON) - I use jQuery, so
> it would be something like this: http://docs.jquery.com/Ajax/jQuery.ajax
> 2. parse the response result and convert it to the JSON format Google
> visualizations wants for consumption, to create a data table
> 3. create a "view" of the data table, specifying the information you want to
> display on the graph
> 4. create the visualization (areaImageChart, annotatedTimeline, etc)
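Sketching steps 2 and 3 above in Python (an illustration only: the column names are invented, but the `cols`/`rows`/`c`/`v` shape is what the Google Visualization DataTable JSON consumes):

```python
import json

def to_datatable(counts_by_day):
    """Convert {day: tweet_count} into the JSON structure a Google
    Visualization DataTable expects: a list of column descriptors
    plus rows of cell objects ({"v": value})."""
    return {
        "cols": [
            {"id": "day", "label": "Day", "type": "string"},
            {"id": "tweets", "label": "Tweets", "type": "number"},
        ],
        "rows": [
            {"c": [{"v": day}, {"v": count}]}
            for day, count in sorted(counts_by_day.items())
        ],
    }

# e.g. daily counts you parsed out of the search responses:
table = to_datatable({"2009-01-07": 12, "2009-01-08": 30})
print(json.dumps(table))
```

You'd serve that JSON to the page and hand it to the visualization (annotatedTimeline etc.) on the client side.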
> Here is an example from Google where the JSON is hardcoded, but aside from
> getting and parsing the data from twitter, this should show you what you