
Connection refused, setup in Docker #6


Open
tonylin75 opened this issue May 23, 2019 · 0 comments


I set up an image on Ubuntu following the instructions. When I run scrapy, it shows an error:

root@f70e0c962d44:/app/PythonScrapyBasicSetup# scrapy crawl UAtester
2019-05-23 15:05:22 [scrapy.utils.log] INFO: Scrapy 1.6.0 started (bot: PythonScrapyBasicSetup)
2019-05-23 15:05:22 [scrapy.utils.log] INFO: Versions: lxml 4.3.3.0, libxml2 2.9.9, cssselect 1.0.3, parsel 1.5.1, w3lib 1.20.0, Twisted 19.2.0, Python 3.6.7 (default, Oct 22 2018, 11:32:17) - [GCC 8.2.0], pyOpenSSL 19.0.0 (OpenSSL 1.1.1b  26 Feb 2019), cryptography 2.6.1, Platform Linux-4.9.125-linuxkit-x86_64-with-Ubuntu-18.04-bionic
2019-05-23 15:05:22 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'PythonScrapyBasicSetup', 'CONCURRENT_REQUESTS': 32, 'COOKIES_ENABLED': False, 'DNS_TIMEOUT': 10, 'DOWNLOAD_TIMEOUT': 24, 'NEWSPIDER_MODULE': 'PythonScrapyBasicSetup.spiders', 'RETRY_HTTP_CODES': [500, 502, 503, 504], 'SPIDER_MODULES': ['PythonScrapyBasicSetup.spiders'], 'TELNETCONSOLE_ENABLED': False}
2019-05-23 15:05:22 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.memusage.MemoryUsage',
 'scrapy.extensions.logstats.LogStats']
2019-05-23 15:05:22 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'PythonScrapyBasicSetup.middlewares.user_agent.RandomUserAgentMiddleware',
 'PythonScrapyBasicSetup.middlewares.proxy.TorProxyMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2019-05-23 15:05:22 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2019-05-23 15:05:22 [scrapy.middleware] INFO: Enabled item pipelines:
[]
2019-05-23 15:05:22 [scrapy.core.engine] INFO: Spider opened
2019-05-23 15:05:22 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2019-05-23 15:05:22 [root] INFO: Using proxy: http://127.0.0.1:8118
2019-05-23 15:05:23 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying <GET http://whatsmyuseragent.org/> (failed 1 times): Connection was refused by other side: 111: Connection refused.
2019-05-23 15:05:23 [root] INFO: Using proxy: http://127.0.0.1:8118
2019-05-23 15:05:23 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying <GET http://whatsmyuseragent.org/> (failed 2 times): Connection was refused by other side: 111: Connection refused.
2019-05-23 15:05:23 [root] INFO: Using proxy: http://127.0.0.1:8118
2019-05-23 15:05:23 [scrapy.downloadermiddlewares.retry] DEBUG: Gave up retrying <GET http://whatsmyuseragent.org/> (failed 3 times): Connection was refused by other side: 111: Connection refused.
2019-05-23 15:05:23 [scrapy.core.scraper] ERROR: Error downloading <GET http://whatsmyuseragent.org/>

And netstat -l shows:

Active Internet connections (only servers)
Proto Recv-Q Send-Q Local Address           Foreign Address         State
tcp        0      0 localhost:9050          0.0.0.0:*               LISTEN
tcp        0      0 localhost:9051          0.0.0.0:*               LISTEN
Active UNIX domain sockets (only servers)
Proto RefCnt Flags       Type       State         I-Node   Path
unix  2      [ ACC ]     STREAM     LISTENING     390389   /var/run/tor/socks
unix  2      [ ACC ]     STREAM     LISTENING     390392   /var/run/tor/control

Did I do anything wrong?
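For context: the log shows the middleware using the proxy http://127.0.0.1:8118 (a Privoxy-style HTTP proxy port), while netstat only shows Tor listening on 9050 and 9051. A minimal sketch for checking whether that proxy port is actually accepting connections (the helper name port_open is my own, not from this project):

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused / timed out: nothing is listening there.
        return False

# If Privoxy (or whatever serves 8118) is not running,
# port_open("127.0.0.1", 8118) returns False, matching the
# "Connection refused" retries in the Scrapy log above.
```

If 8118 is closed but 9050 is open, the Tor SOCKS port is up while the HTTP proxy in front of it is not, which would explain the refused connections.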
