duncan.parkes@gmail.com
8c9ad9213c
Add info url for Elmbridge.
15 years ago
duncan.parkes@gmail.com
cc7c4812d8
Update Rutland scraper to ApplicationSearchServlet - Rutland is no longer RutlandLike!
15 years ago
duncan.parkes
d28506549f
Remove a couple of permanently broken scrapers.
15 years ago
duncan.parkes
c65880cc88
Fix New Forest NP scraper. The new url serves a slightly different page.
15 years ago
duncan.parkes
2fac95ef61
Fix Lake District scraper - small change to the HTML reducing the depth of tables.
15 years ago
duncan.parkes
af6aea014f
This is a big refactor of the scrapers.
The database table which says which scrapers are where will now be filled in automatically,
which should result in rather less in the way of manual editing errors.
I've also redone the python PublicAccess scraper and set all the PublicAccess sites to use
it (removing the PHP PublicAccess scrapers).
15 years ago
duncan.parkes
7a5a50ed58
Update PublicAccess scraper to work with BeautifulSoup.
Add all publicaccess sites to the python scraper.
16 years ago
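The PublicAccess rewrite above moved the scraping onto BeautifulSoup. As a rough illustration of that approach (using the modern `bs4` package rather than the BeautifulSoup 3 of the time, and an entirely hypothetical results-table layout, since each PublicAccess site's markup differs):

```python
from bs4 import BeautifulSoup

# Hypothetical PublicAccess-style search results page; the real markup
# varies per council site and per PublicAccess version.
html = """
<table class="results">
  <tr><td>08/00123/FUL</td><td>1 High Street</td><td>Extension</td></tr>
  <tr><td>08/00124/OUT</td><td>2 Low Road</td><td>New dwelling</td></tr>
</table>
"""

soup = BeautifulSoup(html, "html.parser")
applications = []
for row in soup.find("table", class_="results").find_all("tr"):
    # Collect the text of each cell, stripped of surrounding whitespace.
    cells = [td.get_text(strip=True) for td in row.find_all("td")]
    applications.append(
        {"reference": cells[0], "address": cells[1], "proposal": cells[2]}
    )
```

The advantage over the earlier per-site PHP scrapers is that one parser like this can be pointed at every PublicAccess installation, with only the base url varying.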
duncan.parkes
1bc23b0b9c
Fix Lichfield scraper.
16 years ago
duncan.parkes
cb6ab6e894
Fix Mole Valley scraper.
16 years ago
duncan.parkes
4ea9836f1e
Fix Tamworth scraper (change of IP address).
16 years ago
duncan.parkes
f37d02d3de
Add Thomas' scraper for Solihull.
16 years ago
duncan.parkes
70a94ec280
Add scraper for Weymouth and Portland.
16 years ago
duncan.parkes
887abe9652
Add scraper for Mendip.
Make display method on a planningapplication work out the postcode if it isn't set.
16 years ago
duncan.parkes
77e9d3388f
Add scraper for Broxtowe.
16 years ago
duncan.parkes
7810f01d83
Add scraper for Calderdale.
16 years ago
duncan.parkes
689474a703
Add scraper for the Cairngorms National Park.
16 years ago
duncan.parkes
70c0650637
Add scraper for Leicestershire County Council.
16 years ago
duncan.parkes
f076ecc304
Add scraper for Lichfield. Remove another unused import.
16 years ago
duncan.parkes
f0a0912836
Add parser for Kirklees. Get rid of some unnecessary imports.
16 years ago
duncan.parkes
1761fa79b8
Add python parser for West Dorset, and remove the non-working perl one.
16 years ago
duncan.parkes
ddc81f06ea
Add scraper for Gosport.
Factor out CookieAddingHTTPRedirectHandler.
16 years ago
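A `CookieAddingHTTPRedirectHandler` like the one factored out here would re-attach a cookie on every redirect hop, which matters for sites (like Gosport's, apparently) that bounce you through redirects but drop your session if the cookie is missing. The original was written against Python 2's `urllib2`; this is a sketch of the same idea against Python 3's `urllib.request`:

```python
import urllib.request

class CookieAddingHTTPRedirectHandler(urllib.request.HTTPRedirectHandler):
    """Redirect handler that re-attaches a fixed Cookie header on each hop.

    A sketch of the idea only; the real class lives in the scraper codebase
    and its exact interface is not reproduced here.
    """

    def __init__(self, cookie_value):
        self.cookie_value = cookie_value

    def redirect_request(self, req, fp, code, msg, headers, newurl):
        # Build the follow-up request as the base class would, then make
        # sure the cookie survives onto it.
        new_req = super().redirect_request(req, fp, code, msg, headers, newurl)
        if new_req is not None:
            new_req.add_header("Cookie", self.cookie_value)
        return new_req
```

Installed via `urllib.request.build_opener(CookieAddingHTTPRedirectHandler("session=..."))`, every redirected request then carries the cookie.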
duncan.parkes
a4f3ce9dac
Fix Dorset County Council scraper (the council seem to be using an IP address now rather than the domain they had
before).
16 years ago
duncan.parkes
2c979a07f5
Fix the Dacorum perl parser.
16 years ago
duncan.parkes
748f3b30b5
Add Caerphilly.
16 years ago
duncan.parkes
49a32a74ca
Change some PlanningExplorer scrapers to use date_registered rather than date_received.
16 years ago
peter@peter.uk.to
42bd542634
Highland, North Ayrshire, Redbridge: updated based on changes made to planning websites
16 years ago
duncan.parkes
a20a53535b
Add Waltham Forest.
16 years ago
duncan.parkes
8e40e8a961
Fixes for Lincoln and Crewe.
Make Hackney use date registered rather than date received.
16 years ago
duncan.parkes
6cf496dfb9
Add scraper for Eastbourne. The info and comment links won't work since they require you to have a cookie. If you go
back to them once you have the cookie, you're fine...
16 years ago
duncan.parkes
d030ce81db
Add scraper for Exmoor. Fix name of Herefordshire.
16 years ago
duncan.parkes
48ec82b485
Add scraper for Herefordshire.
Alter PlanningUtils to CDATA everything, scrapping the xmlquote function.
16 years ago
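Replacing an `xmlquote` entity-escaping function with CDATA sections means the scraped text can be emitted as-is, with one wrinkle: the terminator `]]>` may not appear inside a CDATA section, so it has to be split across two sections. A hypothetical helper (the name and exact behaviour of the real PlanningUtils code are not reproduced here):

```python
def xml_cdata(text):
    """Wrap text in a CDATA section instead of entity-escaping it.

    The literal ']]>' sequence cannot occur inside CDATA, so any
    occurrence is split: the section is closed after ']]' and a new
    one opened before '>'.
    """
    return "<![CDATA[%s]]>" % text.replace("]]>", "]]]]><![CDATA[>")
```

This avoids having to escape `&`, `<`, and `>` individually, which is handy when the councils' HTML is being passed through largely untouched.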
duncan.parkes
2bacbbb25a
Add scraper for Hastings. Sadly, no decent info urls again. Had to use the search page. The real info url is only
accessible with a referer.
16 years ago
duncan.parkes
77b46a033d
Add Hampshire scraper.
16 years ago
duncan.parkes
420356966c
Adding scraper for Halton.
Also adding the pycurl scraper for Westminster, just in case it is useful to remind us how to do stuff later.
16 years ago
duncan.parkes
7f7c8a00bc
Go back to the urllib2 version of Westminster. This works on disruptiveproactivity.
16 years ago
duncan.parkes
d303944e39
Add the debug back in.
16 years ago
duncan.parkes
7410196fdd
Try a pycurl version of the Westminster scraper.
16 years ago
duncan.parkes
ef6d27ee0a
Fix Lewisham comments email address.
16 years ago
duncan.parkes
28aaf2eba5
Oops - printing the Sutton results twice...
16 years ago
duncan.parkes
ec5b631342
Add more debug.
16 years ago
duncan.parkes
775b7f8cbc
Try moving prints to above the scrape (for Haringey and Westminster problem).
16 years ago
duncan.parkes
f12fa60f29
Add newlines to the debug stuff in Westminster.
16 years ago
duncan.parkes
3cc4d48397
Some debug (mostly for westminster).
16 years ago
duncan.parkes
827f6a3c53
Carlisle url changed.
16 years ago
duncan.parkes
cc961b2bce
Daventry have replaced their nice url with an IP address...
16 years ago
duncan.parkes
797cedf1d3
Try declaring the charset as utf-8.
16 years ago
duncan.parkes
0e21adea7e
Fix Sutton parser.
16 years ago
duncan.parkes
fa73ab577a
Add scraper for Westminster.
16 years ago
duncan.parkes
7b5165b8bf
Adding scraper for Harrow.
The info url situation here is not really good enough.
All we get is a page with the last 7 days apps on it with no info urls.
I'm using that page as the info url for the moment, but it will obviously
be no use after seven days...
16 years ago
duncan.parkes
da2be2c394
Add scraper for Hounslow.
16 years ago