Get Twitter data with R
Today we will see how to query Twitter using its API and how to store the results in a CSV file so we have them locally.
For this exercise you will need:
1. Have R and RStudio installed on your PC.
2. Have access to the Twitter API, as we saw in this post.
Let’s start. The first thing we will do is load the twitteR library for R.
We create two variables with the credentials that Twitter Developers gave us, in this case api_key and api_secret.
We then call the Twitter authentication function, setup_twitter_oauth, and we are connected.
library(twitteR)
# Credentials from the Twitter Developers site
api_key <- "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
api_secret <- "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
# With only the key and secret, a browser window opens to authorize the app
setup_twitter_oauth(api_key,api_secret)
## [1] "Using browser based authentication"
We create the query using the searchTwitter function, which takes several parameters:
The first parameter is the word or hashtag to search for.
The second parameter is the number of tweets we want.
The third is the reference geography: if we only want tweets from a specific region, we must specify its latitude, longitude and radius, for example: geocode = '37.781157,-122.39720,1km'.
We can also specify the search dates, although Twitter limits searches to the last 7 days of history (a sketch combining geocode and dates follows the query below).
Consulta = searchTwitter("#wannacry", n=1000, geocode=NULL, since="2017-05-10", until="2017-05-15")
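As a hedged illustration of the geocode and date parameters from the list above, a region-restricted version of the same query could look like this; the variable name Consulta_geo and the 1 km radius around the example coordinates are just placeholders.
# A minimal sketch: same hashtag, but restricted to the example coordinates above
Consulta_geo = searchTwitter("#wannacry", n=1000,
                             geocode="37.781157,-122.39720,1km",
                             since="2017-05-10", until="2017-05-15")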
Once the query has run, we convert the result into a data frame we can work with, using the twListToDF function.
To save it as a CSV we use the write.csv function, giving it the name of the data set and the name of the file we want to create.
Datos_Tweets = twListToDF(Consulta)
write.csv(Datos_Tweets, file = "Datos_Tweets.csv")
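Since the point of the CSV is to have the data available locally, we can reload it in a later session without querying Twitter again; a minimal sketch (the variable name Datos_Locales is just illustrative):
# Read the local copy back into a data frame; stringsAsFactors = FALSE keeps
# the tweet text as character strings instead of factors
Datos_Locales <- read.csv("Datos_Tweets.csv", stringsAsFactors = FALSE)
head(Datos_Locales$text)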
Now that we have the data, we can start to analyze it and see what it can tell us.
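As a small sketch of what that analysis could look like: the data frame produced by twListToDF includes, among others, the created and screenName columns, so we can count tweets per day or find the most active accounts; this assumes the Datos_Tweets data frame created in the previous step.
# Number of tweets per day in the sample
table(as.Date(Datos_Tweets$created))
# The ten accounts that tweeted #wannacry most often in this sample
head(sort(table(Datos_Tweets$screenName), decreasing = TRUE), 10)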

