teamhg-memex / domain-discovery-crawler
Broad crawler for domain discovery
License: MIT License
I thought this could not possibly happen, but somehow non-ASCII URLs can reach SMAZ:

```
Traceback (most recent call last):
File "/usr/local/lib/python3.5/site-packages/twisted/internet/task.py", line 517, in _oneWorkUnit
result = next(self._iterator)
File "/usr/local/lib/python3.5/site-packages/scrapy/utils/defer.py", line 63, in <genexpr>
work = (callable(elem, *args, **named) for elem in iterable)
File "/usr/local/lib/python3.5/site-packages/scrapy/core/scraper.py", line 183, in _process_spidermw_output
self.crawler.engine.crawl(request=output, spider=spider)
File "/usr/local/lib/python3.5/site-packages/scrapy/core/engine.py", line 210, in crawl
self.schedule(request, spider)
File "/usr/local/lib/python3.5/site-packages/scrapy/core/engine.py", line 216, in schedule
if not self.slot.scheduler.enqueue_request(request):
File "/usr/local/lib/python3.5/site-packages/scrapy_redis/scheduler.py", line 167, in enqueue_request
self.queue.push(request)
File "/dd_crawler/dd_crawler/queue.py", line 90, in push
data = self._encode_request(request)
File "/dd_crawler/dd_crawler/queue.py", line 392, in _encode_request
return struct.pack('h', depth) + parent + url_compress(request.url)
File "/dd_crawler/dd_crawler/queue.py", line 374, in url_compress
return smaz.compress(url, compression_tree=smaz_tree).encode('latin1')
File "/usr/local/lib/python3.5/site-packages/lib/smaz.py", line 399, in compress
raise ValueError('SMAZ can only process ASCII text.')
ValueError: SMAZ can only process ASCII text
File "/usr/local/lib/python3.5/site-packages/scrapy/utils/defer.py", line 102, in iter_errback
yield next(it)
File "/usr/local/lib/python3.5/site-packages/scrapy/spidermiddlewares/offsite.py", line 29, in process_spider_output
for x in result:
File "/dd_crawler/dd_crawler/middleware/domains.py", line 71, in process_spider_output
for item in (result or []):
File "/dd_crawler/dd_crawler/middleware/log.py", line 31, in process_spider_output
for item in result:
File "/usr/local/lib/python3.5/site-packages/scrapy/spidermiddlewares/referer.py", line 339, in <genexpr>
return (_set_referer(r) for r in result or ())
File "/dd_crawler/dd_crawler/middleware/dupesegments.py", line 41, in process_spider_output
for el in result:
File "/usr/local/lib/python3.5/site-packages/scrapy/spidermiddlewares/urllength.py", line 37, in <genexpr>
return (r for r in result or () if _filter(r))
File "/usr/local/lib/python3.5/site-packages/scrapy/spidermiddlewares/depth.py", line 58, in <genexpr>
return (r for r in result or () if _filter(r))
File "/dd_crawler/dd_crawler/spiders.py", line 39, in parse
yield self.page_item(response)
File "/dd_crawler/dd_crawler/spiders.py", line 159, in page_item
item = super().page_item(response)
File "/dd_crawler/dd_crawler/spiders.py", line 76, in page_item
'parent': _url_hash_as_str(response.meta.get('parent')),
File "/usr/local/lib/python3.5/site-packages/scrapy_cdr/utils.py", line 19, in text_cdr_item
response_headers=response.headers.to_unicode_dict(),
File "/usr/local/lib/python3.5/site-packages/scrapy/http/headers.py", line 89, in to_unicode_dict
for key, value in self.items())
File "/usr/local/lib/python3.5/site-packages/scrapy/utils/datatypes.py", line 193, in __init__
self.update(seq)
File "/usr/local/lib/python3.5/site-packages/scrapy/utils/datatypes.py", line 229, in update
super(CaselessDict, self).update(iseq)
File "/usr/local/lib/python3.5/site-packages/scrapy/utils/datatypes.py", line 228, in <genexpr>
iseq = ((self.normkey(k), self.normvalue(v)) for k, v in seq)
File "/usr/local/lib/python3.5/site-packages/scrapy/http/headers.py", line 89, in <genexpr>
for key, value in self.items())
File "/usr/local/lib/python3.5/site-packages/scrapy/utils/python.py", line 107, in to_unicode
return text.decode(encoding, errors)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0x94 in position 7: invalid start byte
```
Right now we are using an old fork with a few commits that fixed Python 3 compatibility. But upstream has since fixed Python 3 compatibility, so it would be better to switch to the stable, supported version.
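Until that switch happens, one possible stopgap (a sketch, not the project's actual fix) is to percent-encode any non-ASCII URL before it reaches `url_compress`, since SMAZ only accepts ASCII input. `quote()` leaves URL-legal ASCII characters untouched, so already-ASCII URLs round-trip unchanged. The `ensure_ascii_url` name and the `safe` set below are assumptions, not code from the repo:

```python
from urllib.parse import quote

def ensure_ascii_url(url: str) -> str:
    """Percent-encode non-ASCII bytes so the URL is pure ASCII.

    The safe set covers characters already legal in URLs (including
    '%', so previously percent-encoded URLs are not double-encoded),
    meaning ASCII URLs pass through unchanged.
    """
    return quote(url, safe=":/?#[]@!$&'()*+,;=%~-._")

# In queue.py's url_compress, the normalized URL could then be fed to
# SMAZ (smaz and smaz_tree as they appear in the traceback):
#     return smaz.compress(ensure_ascii_url(url),
#                          compression_tree=smaz_tree).encode('latin1')
```

This only papers over the enqueue-side `ValueError`; the second traceback (non-UTF-8 response headers) is a separate decoding issue.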