duncan.parkes
55b14a2b71
This is a big refactor of the scrapers.
The database table recording which scrapers are where is now filled in automatically,
which should cut down on manual editing errors.
I've also redone the Python PublicAccess scraper and set all the PublicAccess sites to use
it (removing the PHP PublicAccess scrapers).
15 years ago
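A minimal sketch of how that automatic registration might work, assuming a hypothetical scrapers/ directory of Python modules that each expose an AUTHORITY_NAME constant, with a SQLite table standing in for whatever database the site really uses; none of these names are taken from the repo.

```python
import os
import sqlite3
import importlib

# Hypothetical layout: every module in scrapers/ defines AUTHORITY_NAME.
SCRAPER_DIR = "scrapers"

conn = sqlite3.connect("scrapers.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS scraper_location "
    "(authority TEXT PRIMARY KEY, module TEXT)"
)

for filename in sorted(os.listdir(SCRAPER_DIR)):
    if not filename.endswith(".py") or filename.startswith("_"):
        continue
    module_name = filename[:-3]
    module = importlib.import_module("%s.%s" % (SCRAPER_DIR, module_name))
    # Upsert, so re-running the registration never needs manual edits.
    conn.execute(
        "INSERT OR REPLACE INTO scraper_location (authority, module) VALUES (?, ?)",
        (module.AUTHORITY_NAME, module_name),
    )

conn.commit()
```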
duncan.parkes
086b04b1c1
Update PublicAccess scraper to work with BeautifulSoup.
Add all PublicAccess sites to the Python scraper.
16 years ago
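As an illustration of the BeautifulSoup approach, a minimal sketch of pulling applications out of a PublicAccess-style results page might look like this, assuming the Python 2 / BeautifulSoup 3 stack used elsewhere in the repo; the URL, row class and column layout are invented for the example, not taken from the real scraper.

```python
import urllib2
from BeautifulSoup import BeautifulSoup  # BeautifulSoup 3, Python 2 era

# Placeholder results URL - the real per-council URLs live in the database.
SEARCH_URL = "http://example.gov.uk/PublicAccess/tdc/DcApplication/application_searchresults.aspx"

soup = BeautifulSoup(urllib2.urlopen(SEARCH_URL).read())

# Assume one application per table row; the class name is a guess.
for row in soup.findAll("tr", {"class": "searchresult"}):
    cells = row.findAll("td")
    if len(cells) >= 2:
        reference = "".join(cells[0].findAll(text=True)).strip()
        address = "".join(cells[1].findAll(text=True)).strip()
        print reference, address
```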
duncan.parkes
8e55ba5cfc
Fix Lichfield scraper.
16 years ago
duncan.parkes
886d866adb
Fix Mole Valley scraper.
16 years ago
duncan.parkes
95fb4749d6
Fix Tamworth scraper (change of IP address).
16 years ago
duncan.parkes
df32136695
Add Thomas' scraper for Solihull.
16 years ago
duncan.parkes
df565a5098
Add scraper for Weymouth and Portland.
16 years ago
duncan.parkes
8eaa83b4cf
Add scraper for Mendip.
Make the display method on a PlanningApplication work out the postcode if it isn't set.
16 years ago
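A rough sketch of that kind of fallback, assuming the postcode is pulled out of the address string with a regular expression when it hasn't been set explicitly; the class and method shape below are illustrative, not the repo's actual code.

```python
import re

# Rough UK postcode pattern, good enough for spotting one in an address.
POSTCODE_RE = re.compile(r"[A-Z]{1,2}[0-9R][0-9A-Z]? ?[0-9][A-Z]{2}", re.I)

class PlanningApplication:
    def __init__(self, address, postcode=None):
        self.address = address
        self.postcode = postcode

    def display(self):
        # If no postcode was set, try to work one out from the address.
        postcode = self.postcode
        if not postcode:
            match = POSTCODE_RE.search(self.address)
            postcode = match.group(0) if match else ""
        return "%s (%s)" % (self.address, postcode)
```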
duncan.parkes
42d99b3e7f
Add scraper for Broxtowe.
16 years ago
duncan.parkes
81851a73b4
Add scraper for Calderdale.
16 years ago
duncan.parkes
1cd560ace6
Add scraper for the Cairngorms National Park.
16 years ago
duncan.parkes
09ca7050c3
Add scraper for Leicestershire County Council.
16 years ago
duncan.parkes
74b3bedb63
Add scraper for Lichfield. Remove another unused import.
16 years ago
duncan.parkes
e39114078f
Add parser for Kirklees. Get rid of some unnecessary imports.
16 years ago
duncan.parkes
15fca1a280
Add Python parser for West Dorset, and remove the non-working Perl one.
16 years ago
duncan.parkes
53d0b25f78
Add scraper for Gosport.
Factor out CookieAddingHTTPRedirectHandler.
16 years ago
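A hedged guess at what the factored-out handler does: a urllib2 HTTPRedirectHandler subclass that copies the Set-Cookie header from the redirecting response onto the follow-up request. The class name comes from the commit message, but the body below is a sketch of the idea rather than the repo's actual code.

```python
import urllib2

class CookieAddingHTTPRedirectHandler(urllib2.HTTPRedirectHandler):
    """Copy the Set-Cookie header from a redirect response onto the
    request for the new location, for sites that insist on it."""

    def redirect_request(self, req, fp, code, msg, headers, newurl):
        new_request = urllib2.HTTPRedirectHandler.redirect_request(
            self, req, fp, code, msg, headers, newurl)
        cookie = headers.getheader("Set-Cookie")
        if new_request is not None and cookie:
            new_request.add_header("Cookie", cookie)
        return new_request

# Typical use: build an opener with the handler and fetch as normal.
opener = urllib2.build_opener(CookieAddingHTTPRedirectHandler)
```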
duncan.parkes
1e4391244b
Fix Dorset County Council scraper (the council seem to be using an IP address now rather than the domain they had
before).
16 years ago
duncan.parkes
6819674d73
Fix the Dacorum Perl parser.
16 years ago
duncan.parkes
8e5d0a4aa9
Add Caerphilly.
16 years ago
duncan.parkes
27f10ceade
Change some PlanningExplorer scrapers to use date_registered rather than date_received.
16 years ago
peter
ad59379cb0
Highland, North Ayrshire, Redbridge: updated based on changes made to planning websites
16 years ago
duncan.parkes
657693111b
Add Waltham Forest.
16 years ago
duncan.parkes
d53a2f877f
Fixes for Lincoln and Crewe.
Make Hackney use date registered rather than date received.
16 years ago
duncan.parkes
92e757b8dd
Add scraper for Eastbourne. The info and comment links won't work since they require you to have a cookie. If you go
back to them once you have the cookie, you're fine...
16 years ago
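The cookie dance described here can be handled with a shared cookielib jar, so the cookie picked up on the first visit is sent automatically when the info url is fetched again; the URLs below are placeholders, not Eastbourne's real ones.

```python
import urllib2
import cookielib

# One cookie jar shared across requests, so the cookie set by the
# search page is sent back when the info url is revisited.
cookie_jar = cookielib.CookieJar()
opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cookie_jar))

# First visit: hypothetical search page that sets the required cookie.
opener.open("http://example.gov.uk/planning/search.aspx")

# Second visit: the info url works now that the cookie is present.
info_page = opener.open("http://example.gov.uk/planning/detail.aspx?ref=ABC-123").read()
```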
duncan.parkes
2f155cbc6f
Add scraper for Exmoor. Fix name of Herefordshire.
16 years ago
duncan.parkes
d92f3bb6fd
Add scraper for Herefordshire.
Alter PlanningUtils to CDATA everything, scrapping the xmlquote function.
16 years ago
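For illustration, wrapping values in CDATA instead of entity-escaping them could look like the sketch below; xml_cdata is an invented name, and the only wrinkle handled is the "]]>" sequence, which cannot appear literally inside a CDATA section.

```python
def xml_cdata(text):
    """Wrap text in a CDATA section instead of entity-escaping it.

    CDATA cannot contain the sequence "]]>", so split that across two
    adjacent CDATA sections if it turns up in the data.
    """
    return "<![CDATA[%s]]>" % text.replace("]]>", "]]]]><![CDATA[>")

# e.g. produces <address><![CDATA[1 High Street & Co]]></address>
print "<address>%s</address>" % xml_cdata("1 High Street & Co")
```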
duncan.parkes
0bfe5dace9
Add scraper for Hastings. Sadly, no decent info urls again. Had to use the search page. The real info url is only
accessible with a referer.
16 years ago
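Fetching a page that is only served with a Referer header is straightforward with urllib2, roughly as below; the URLs are stand-ins rather than the real Hastings ones.

```python
import urllib2

SEARCH_URL = "http://example.gov.uk/planning/searchresults.aspx"  # placeholder
INFO_URL = "http://example.gov.uk/planning/detail.aspx?id=12345"  # placeholder

# The detail page is only served when the request appears to come from
# the search page, so set the Referer header explicitly.
request = urllib2.Request(INFO_URL)
request.add_header("Referer", SEARCH_URL)
page = urllib2.urlopen(request).read()
```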
duncan.parkes
33a1ee02ab
Add Hampshire scraper.
16 years ago
duncan.parkes
1510528a8a
Adding scraper for Halton.
Also adding the pycurl scraper for Westminster, just in case it is useful to remind us how to do stuff later.
16 years ago
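Since the pycurl version is kept partly as a reminder, a minimal pycurl fetch looks roughly like this; the URL and options are illustrative, not Westminster's actual settings.

```python
import pycurl
from StringIO import StringIO

buf = StringIO()

curl = pycurl.Curl()
curl.setopt(pycurl.URL, "http://example.gov.uk/planning/search.aspx")  # placeholder
curl.setopt(pycurl.WRITEFUNCTION, buf.write)
curl.setopt(pycurl.FOLLOWLOCATION, 1)  # follow redirects
curl.setopt(pycurl.COOKIEFILE, "")     # turn on in-memory cookie handling
curl.perform()
curl.close()

html = buf.getvalue()
```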
duncan.parkes
049d494db5
Go back to the urllib2 version of Westminster. This works on disruptiveproactivity.
16 years ago
duncan.parkes
25ace0078b
Add the debug back in.
16 years ago
duncan.parkes
22dae1f61d
Try a pycurl version of the Westminster scraper.
16 years ago
duncan.parkes
9e8360403f
Fix Lewisham comments email address.
16 years ago
duncan.parkes
b0bfa441b2
Oops - printing the Sutton results twice...
16 years ago
duncan.parkes
240f1b2d84
Add more debug.
16 years ago
duncan.parkes
babe3c4d20
Try moving prints to above the scrape (for the Haringey and Westminster problem).
16 years ago
duncan.parkes
efb8835ffe
Add newlines to the debug stuff in Westminster.
16 years ago
duncan.parkes
189d6c7a13
Some debug (mostly for Westminster).
16 years ago
duncan.parkes
70306a6aae
Carlisle url changed.
16 years ago
duncan.parkes
bbe8dd6819
Daventry have replaced their nice url with an IP address...
16 years ago
duncan.parkes
1f1e52716a
Try declaring the charset as utf-8.
16 years ago
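It isn't clear from the message whether this refers to the scraper's XML output or to the page it serves, so here is a hedged sketch covering both the HTTP header and the XML declaration.

```python
# -*- coding: utf-8 -*-
import sys

# If the output goes over CGI, declare the charset on the HTTP header...
sys.stdout.write("Content-Type: text/xml; charset=utf-8\r\n\r\n")

# ...and on the XML declaration of the document itself.
sys.stdout.write('<?xml version="1.0" encoding="UTF-8"?>\n')
```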
duncan.parkes
83271dd9ac
Fix Sutton parser.
16 years ago
duncan.parkes
5b5b8c9ac5
Add scraper for Westminster.
16 years ago
duncan.parkes
54190fe977
Adding scraper for Harrow.
The info url situation here is not really good enough.
All we get is a page with the last 7 days' apps on it, with no info urls.
I'm using that page as the info url for the moment, but it will obviously
be no use after seven days...
16 years ago
duncan.parkes
98f49f172f
Add scraper for Hounslow.
16 years ago
duncan.parkes
cbf14b5169
Add scraper for Kingston upon Thames.
16 years ago
duncan.parkes
bcf26d0e43
Add scraper for Birmingham.
16 years ago
duncan.parkes
fed099cffb
Add Berwick scraper.
16 years ago
duncan.parkes
fb7ba977ae
Add Carmarthenshire scraper.
16 years ago
duncan.parkes
193f2a6cac
Add scraper for Brent.
16 years ago