I figured it out.  The clue was the error message (as always!) showing 
packages being loaded from /usr/local.
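
In case it helps anyone else: a quick way to see which prefix and search 
path Python is actually using (just a sketch -- the exact paths printed 
will differ per machine):

```shell
# Outside a virtual environment, these point at the system install
# (/usr or /usr/local), which is where my broken packages lived:
python3 -c 'import sys; print(sys.prefix)'
python3 -c 'import sys; print("\n".join(sys.path))'
```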

The problem was that the Scrapy docs didn't explain how to use virtual 
environments well enough.  I read through the virtualenv docs, learned 
what I needed in under 10 minutes, installed Scrapy correctly inside the 
virtual environment, and life is good.
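
For the record, here's roughly what ended up working (a sketch, not 
gospel -- ~/ENV-Scrapy is just the directory name I happened to pick):

```shell
# Create an isolated environment so Scrapy's dependencies don't
# collide with whatever is already sitting in /usr/local:
virtualenv -p python3 ~/ENV-Scrapy   # (python3 -m venv also works on newer systems)
source ~/ENV-Scrapy/bin/activate     # prompt gains an (ENV-Scrapy) prefix
pip install scrapy                   # installs into the env, not the system
which scrapy                         # should print ~/ENV-Scrapy/bin/scrapy
deactivate                           # leave the environment when done
```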



On Sunday, January 22, 2017 at 10:46:11 AM UTC-5, Peter wrote:
>
> Sorry -- I was expecting to be "subscribed" to this thread.  Apparently, 
> I'm not, for some reason.
>
> Wouldn't "pip install scrapy" install Scrapy for Python 2?   I'm trying 
> to replace Perl with Python as my go-to scripting language, and it seems 
> like Python 3 is what I should be targeting.
>
> Anyway, I tried your suggestion with both pip and pip3.  I ran the 
> test-spider in the official docs, and got the same strange AttributeError.  
> Is there anything you can glean from the posted error message?
>
>
>
> On Sunday, January 15, 2017 at 11:34:38 PM UTC-5, vanam raghu wrote:
>>
>> Can you try running "pip install scrapy"?  I think this should install 
>> Scrapy 1.3. 
>> If you want version 1.2.2, run "pip install scrapy==1.2.2" instead, and 
>> let me know what you see.  On Ubuntu, chances are that you are missing 
>> some dependencies when running Scrapy.
>>
>> On Saturday, 14 January 2017 03:07:24 UTC+5:30, Peter wrote:
>>>
>>> On Ubuntu 14.04.5.  Python newbie.  Trying to install Scrapy by 
>>> following the Scrapy 1.3.0 documentation.  Using Python 3.  Google has 
>>> been guiding me, but I've hit a brick wall:
>>>
>>> $ scrapy --version
>>>> Traceback (most recent call last):
>>>>   File "/usr/local/bin/scrapy", line 7, in <module>
>>>>     from scrapy.cmdline import execute
>>>>   File "/usr/local/lib/python3.4/dist-packages/scrapy/cmdline.py", line 9, in <module>
>>>>     from scrapy.crawler import CrawlerProcess
>>>>   File "/usr/local/lib/python3.4/dist-packages/scrapy/crawler.py", line 7, in <module>
>>>>     from twisted.internet import reactor, defer
>>>>   File "/usr/local/lib/python3.4/dist-packages/twisted/internet/reactor.py", line 38, in <module>
>>>>     from twisted.internet import default
>>>>   File "/usr/local/lib/python3.4/dist-packages/twisted/internet/default.py", line 56, in <module>
>>>>     install = _getInstallFunction(platform)
>>>>   File "/usr/local/lib/python3.4/dist-packages/twisted/internet/default.py", line 44, in _getInstallFunction
>>>>     from twisted.internet.epollreactor import install
>>>>   File "/usr/local/lib/python3.4/dist-packages/twisted/internet/epollreactor.py", line 24, in <module>
>>>>     from twisted.internet import posixbase
>>>>   File "/usr/local/lib/python3.4/dist-packages/twisted/internet/posixbase.py", line 18, in <module>
>>>>     from twisted.internet import error, udp, tcp
>>>>   File "/usr/local/lib/python3.4/dist-packages/twisted/internet/tcp.py", line 28, in <module>
>>>>     from twisted.internet._newtls import (
>>>>   File "/usr/local/lib/python3.4/dist-packages/twisted/internet/_newtls.py", line 21, in <module>
>>>>     from twisted.protocols.tls import TLSMemoryBIOFactory, TLSMemoryBIOProtocol
>>>>   File "/usr/local/lib/python3.4/dist-packages/twisted/protocols/tls.py", line 65, in <module>
>>>>     from twisted.internet._sslverify import _setAcceptableProtocols
>>>>   File "/usr/local/lib/python3.4/dist-packages/twisted/internet/_sslverify.py", line 204, in <module>
>>>>     verifyHostname, VerificationError = _selectVerifyImplementation(OpenSSL)
>>>>   File "/usr/local/lib/python3.4/dist-packages/twisted/internet/_sslverify.py", line 179, in _selectVerifyImplementation
>>>>     from service_identity import VerificationError
>>>>   File "/usr/local/lib/python3.4/dist-packages/service_identity/__init__.py", line 7, in <module>
>>>>     from . import pyopenssl
>>>>   File "/usr/local/lib/python3.4/dist-packages/service_identity/pyopenssl.py", line 14, in <module>
>>>>     from .exceptions import SubjectAltNameWarning
>>>>   File "/usr/local/lib/python3.4/dist-packages/service_identity/exceptions.py", line 21, in <module>
>>>>     @attr.s
>>>> AttributeError: 'module' object has no attribute 's'
>>>
>>>
>>>
>>> I installed virtualenv and created a virtual environment 
>>> in ~/ENV-Scrapy.  My intention was to install Scrapy and its dependencies 
>>> into that environment, but I really don't know what I'm doing, and I'm 
>>> wondering if that's what happened.  I'm also wondering if I'm invoking 
>>> scrapy correctly.
>>>
>>> Will some kind soul help me figure out how to get a healthy Scrapy 
>>> installation?  Many thanks!
>>>
>>

-- 
You received this message because you are subscribed to the Google Groups 
"scrapy-users" group.