191 Commits (28dbf0d579e3d55e565bf8370dff1cd4d9edeb5f)

Author SHA1 Message Date
duncan.parkes 28dbf0d579 Sort out Harlow comment url. 16 years ago
duncan.parkes 70d7218bfd Fix problem with Harlow. Council had moved the acolnet site to a different obscure location. 16 years ago
duncan.parkes cdcf82022a Use the python scraper for North Hertfordshire. 17 years ago
duncan.parkes a124a4c3cd Fix more acolnet parsers. 17 years ago
duncan.parkes bb6e536a73 Fix Hertsmere parser. 17 years ago
duncan.parkes e5c1a97d9c Fix Bassetlaw parser. 17 years ago
duncan.parkes 0f199a9a93 Fix Fylde parser. 17 years ago
duncan.parkes ce7bad8492 As usual, I forgot to make this work with python 2.4. 17 years ago
duncan.parkes 96070fd3a1 Add scraper for the WAM sites (apart from Westminster, which is a bit different). 17 years ago
duncan.parkes b23bf56f12 Add Chichester, which now has a PublicAccess site. 17 years ago
duncan.parkes aa7d2b9914 Change the comments for Lewisham to use their email address, as per request. 17 years ago
duncan.parkes 2be99c221e Start implementing Merton Parser. There are going to be a few commits for a bit, as this doesn't work properly from 17 years ago
mikel.maron b1586f2a4a fixed link 17 years ago
duncan.parkes 685f9f185a Fix Edinburgh scraper. 17 years ago
duncan.parkes c1dc167ea9 Fix Canterbury parser. 17 years ago
duncan.parkes b5f1b1ae89 Fix a bug in almost all of the Acolnet scrapers. 17 years ago
duncan.parkes bd05c50ae6 Fix the Lewisham scraper. 17 years ago
duncan.parkes 76e5ec8732 Fix New Forest NPA. 17 years ago
duncan.parkes d54bb032b7 fix: 17 years ago
duncan.parkes 2cf579cf5d Fix the comment and info links for salford. 17 years ago
duncan.parkes ec5966dd84 Fix the comment and info urls for sheffield. 17 years ago
duncan.parkes b076e29712 Rather embarrassing import failure... 17 years ago
duncan.parkes e3e449e12c Add New version of the Acolnet scraper using BeautifulSoup instead of HTMLParser. 17 years ago
duncan.parkes b32aee3c1c Add Somerset scraper 17 years ago
duncan.parkes 711f1961f9 New version of South Somerset from Tom. 17 years ago
duncan.parkes ed41047120 Allow longer info and comment urls 17 years ago
duncan.parkes 593e9d8272 Add SouthSomerset, Christchurch and West Dorset. 17 years ago
duncan.parkes 38f431bed1 Add Dacorum scraper. 17 years ago
duncan.parkes 5e8781a7a9 Add Dorset CC scraper 17 years ago
duncan.parkes b98c452684 Add shrewsbury scraper. 17 years ago
duncan.parkes 8b76d0fe99 Add Swansea scraper. 17 years ago
duncan.parkes c2684e39c8 Spell Hinckley correctly - this will need a change in the db too. 17 years ago
duncan.parkes 5b3c561894 Add Boston, which I missed for some reason when I was doing the SwiftLG sites. 17 years ago
duncan.parkes cc07cc88ae Add scrapers for the SwiftLG sites (apart from Daventry which is not responding today). 17 years ago
duncan.parkes eab1ffa328 Oops. Forgot to add a file. 17 years ago
duncan.parkes fe6d58bd05 Add PlanningExplorer scrapers: 17 years ago
duncan.parkes 0a6564b92b Add Suffolk scraper. 17 years ago
duncan.parkes 833cbf6cdb Add Exeter scraper 17 years ago
duncan.parkes 562d4a64c5 Allow &amp; in place of & for one of the urls (to fix Oldham, which has suddenly started to need this...) 17 years ago
duncan.parkes 7d96ff751e Add Bolton scraper (Acolnet) 17 years ago
duncan.parkes c40c4e18ba Replace the datetime.strptime with a workaround for python 2.4 17 years ago
duncan.parkes 088a5387c1 Add a parser for sites with urls ending in searchPageLoad.do 17 years ago
duncan.parkes f65ff627cf Add a RutlandLike scraper 17 years ago
duncan.parkes da664e05dc Fix Haringey 17 years ago
duncan.parkes 8005512f93 Fix problems with the following scrapers: 17 years ago
duncan.parkes 6153697ab1 Fix the New Forest District Council Scraper. 17 years ago
duncan.parkes 0d79dadb2b Fix change to Mansfield URL, and change to columns in Allerdale display. 17 years ago
duncan.parkes 2056fca395 Make the planningalerts root directory group writable. 17 years ago
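Commit c40c4e18ba mentions working around the absence of datetime.strptime on python 2.4 (it was only added in 2.5). The repository's actual replacement code is not shown here, but the idiom commonly used for this is a sketch along these lines; the function name and date format are illustrative assumptions, not taken from the repo:

```python
import time
from datetime import datetime

def parse_date(date_string, format_string="%d/%m/%Y"):
    """Parse a date string without relying on datetime.strptime(),
    which did not exist before Python 2.5."""
    # time.strptime() returns a struct_time whose first six fields are
    # (year, month, day, hour, minute, second) - exactly the positional
    # arguments the datetime constructor expects.
    return datetime(*time.strptime(date_string, format_string)[:6])
```

The same one-liner still runs on modern Python, so it was a low-risk way to keep one code path for both interpreter versions.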
memespring adb03f9529 added link to groups near you on signup success 17 years ago
duncan.parkes ad6091868e Add a robots.txt file to stop search engines bothering our scrapers 17 years ago