PRE-ALPHA: Only works with Idox and Northgate sites and spews a lot of stuff to STDOUT. Not for production use.
This gem scrapes planning applications data from UK council/local planning authority websites, eg Westminster City Council. Data is returned as an array of hashes, one hash for each planning application.
This scraper gem doesn’t use a database. Storing the output is up to you. It’s just a convenient way to get the data.
Currently this only works for Idox and Northgate sites. The ultimate aim is to provide a consistent interface in a single gem for all variants of all planning systems: Idox Public Access, Northgate Planning Explorer, OcellaWeb, Agile Planning and all the one-off systems.
This project is not affiliated with any organisation.
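Each element of the returned array is a plain Ruby hash. As a rough illustration only: the exact fields depend on the site and scraper, and apart from `authority_name` and `council_reference` (used for saving later in this README) the keys and values below are assumptions:

```ruby
# Illustrative sketch of one scraped application (field names are assumptions,
# except authority_name and council_reference, which appear elsewhere in this README).
{
  authority_name:    "Westminster",
  council_reference: "19/01234/FULL",
  # ...plus whatever dates, addresses and descriptions the scraper can extract
}
```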
## Installation

Add this line to your application’s Gemfile:
gem 'uk_planning_scraper', :git => 'https://github.com/adrianshort/uk_planning_scraper/'
And then execute:
$ bundle install
Or install it directly from GitHub using the specific_install gem:
$ gem install specific_install
$ gem specific_install adrianshort/uk_planning_scraper
## Usage

Scrape last week’s planning decisions for a single authority:

require 'uk_planning_scraper'
require 'pp'
apps = UKPlanningScraper::Authority.named('Westminster').scrape({ decided_days: 7 })
pp apps
Or scrape every authority tagged 'london' in one go:

auths = UKPlanningScraper::Authority.tagged('london')
auths.each do |auth|
apps = auth.scrape({ decided_days: 7 })
pp apps # You'll probably want to save `apps` to your database here
end
Yes, we just scraped the last week’s planning decisions across the whole of London (actually 23 of the 35 authorities right now) with five lines of code.
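As a small follow-on sketch, using only the calls shown above, you could tally last week’s decisions per authority:

```ruby
require 'uk_planning_scraper'

# Count last week's decided applications for each authority tagged 'london'.
counts = {}
UKPlanningScraper::Authority.tagged('london').each do |auth|
  counts[auth.name] = auth.scrape({ decided_days: 7 }).size
end

# Print authorities in descending order of decision count.
counts.sort_by { |_name, n| -n }.each { |name, n| puts "#{name}: #{n}" }
```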
Or search for launderette applications validated in the last week across Scotland:

auths = UKPlanningScraper::Authority.tagged('scotland')
auths.each do |auth|
apps = auth.scrape({ validated_days: 7, keywords: 'launderette' })
pp apps # You'll probably want to save `apps` to your database here
end
### More search parameters
# Don't try these all at once
params = {
received_to: Date.today,
received_from: Date.today - 30,
received_days: 7, # instead of received_to, received_from
validated_to: Date.today,
validated_from: Date.today - 30,
validated_days: 7, # instead of validated_to, validated_from
decided_to: Date.today,
decided_from: Date.today - 30,
decided_days: 7, # instead of decided_to, decided_from
keywords: "hip gable", # Check that the systems you're scraping return the results you expect for multiple keywords (AND or OR?)
}
apps = UKPlanningScraper::Authority.named('Camden').scrape(params)
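For example, here is a sketch that combines a date range and a keyword from the list above for a single authority (pass only the parameters you actually need; the keyword is just an example):

```ruby
require 'date'
require 'uk_planning_scraper'

# Applications received by Camden in the last 30 days that mention "extension".
params = {
  received_from: Date.today - 30,
  received_to:   Date.today,
  keywords:      'extension'
}

apps = UKPlanningScraper::Authority.named('Camden').scrape(params)
puts "Found #{apps.size} applications"
```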
## Saving the data

This gem has no interest whatsoever in persistence. What you do with the data it outputs is up to you: relational databases, document stores, VHS and clay tablets are all blissfully none of its business. But using the ScraperWiki gem is a really easy way to store your data:
require 'scraperwiki' # Must be installed, of course
ScraperWiki.save_sqlite([:authority_name, :council_reference], apps)
That `apps` param can be a hash or an array of hashes, which is what `scrape()` returns.
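If you’d rather skip ScraperWiki, here’s one equally valid alternative using nothing but Ruby’s standard library: a sketch that writes the hashes straight to a CSV file, assuming `apps` is a non-empty array of hashes that all share the same keys:

```ruby
require 'csv'

# Write the scraped applications to applications.csv, one row per application.
headers = apps.first.keys
CSV.open('applications.csv', 'w') do |csv|
  csv << headers                                    # header row from the first hash
  apps.each { |app| csv << app.values_at(*headers) }
end
```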
## Tags

Tags are always lowercase and one word.
london_auths = UKPlanningScraper::Authority.tagged('london')
We’ve got tags for areas (eg london, southlondon, scotland, england), tags for software systems (eg northgate, ocellaweb, agileplanning), and whatever you’d like to add that would be useful to others.
UKPlanningScraper::Authority.named('Merton').tags
# => ["london", "outerlondon", "southlondon", "england", "northgate", "londonboroughs"]
UKPlanningScraper::Authority.not_tagged('london')
# => [...]
UKPlanningScraper::Authority.named('Islington').tagged?('southlondon')
# => false
UKPlanningScraper::Authority.all.each { |a| puts a.name } # list every authority
pp UKPlanningScraper::Authority.tags # list every tag in use
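Because these finder methods are iterated with `each` above, they appear to return plain arrays of authority objects. If so (an assumption rather than a documented guarantee), you can combine queries with ordinary Ruby array operations:

```ruby
# Assumes tagged() returns Arrays containing the same Authority objects,
# so set intersection (&) works as expected.
london_northgate = UKPlanningScraper::Authority.tagged('london') &
                   UKPlanningScraper::Authority.tagged('northgate')

london_northgate.each { |auth| puts auth.name }
```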
## Adding authorities

The list of authorities is in a CSV file in /lib/uk_planning_scraper:
The easiest way to add to or edit this list is to edit within GitHub (use the pencil icon) and create a new pull request for your changes. If accepted, your changes will be available to everyone with the next version of the gem.
The file format is one line per authority, with comma-separated fields; follow the format of the existing entries (note that the City of London is a special case). Currently only the Idox and Northgate scrapers work, but feel free to add authorities that use other systems, along with appropriate system tags like ocellaweb and agileplanning. This gem selects the appropriate scraper by examining the URL, not by looking at the tags, so it doesn’t matter what you use as long as it’s consistent with others.
Please check the tag list before you change anything:
pp UKPlanningScraper::Authority.tags
## Development

After checking out the repo, run `bin/setup` to install dependencies. You can also run `bin/console` for an interactive prompt that will allow you to experiment.

To install this gem onto your local machine, run `bundle exec rake install`. To release a new version, update the version number in `version.rb`, then run `bundle exec rake release`, which will create a git tag for the version, push git commits and tags, and push the `.gem` file to rubygems.org.
## Contributing

Bug reports and pull requests are welcome on GitHub at https://github.com/adrianshort/uk_planning_scraper.