Installation reference blog: http://davenzhang.com/scrapy_install.htm
I installed Scrapy first and found that import scrapy raised an error. I then installed the .exe installers for the related packages below, after which the import worked:
- Twisted: twisted-Download
- zope.interface: zope.interface-Download
- lxml: lxml-Download
- pyOpenSSL: pyopenssl-Download
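A quick way to confirm those four dependencies are all importable is a small helper like the one below (my own sketch, not part of any of these packages). Note that the import names differ from the installer names, e.g. pyOpenSSL installs a module named OpenSSL:

```python
import importlib

def check_deps(names):
    """Map each module name to True/False depending on whether it imports."""
    results = {}
    for name in names:
        try:
            importlib.import_module(name)
            results[name] = True
        except ImportError:
            results[name] = False
    return results

# Import names for Twisted, zope.interface, lxml and pyOpenSSL.
print(check_deps(["twisted", "zope.interface", "lxml", "OpenSSL"]))
```

Any False in the output points at the dependency whose installer still needs to be (re)run.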
At this point import scrapy worked fine, but scrapy startproject demo failed, and scrapy version errored out as well:
D:\Just4Study\Python\TestProgram>C:\Python27\Scripts\scrapy version
Traceback (most recent call last):
  File "C:\Python27\lib\runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "C:\Python27\lib\runpy.py", line 72, in _run_code
    exec code in run_globals
  File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 5, in <module>
    import pkg_resources
ImportError: No module named pkg_resources
Tracing the error led to an explanation: setuptools is either missing or was not installed properly. The install method I found:
First, download:
wget http://peak.telecommunity.com/dist/ez_setup.py
Then install:
python ez_setup.py
D:\Just4Study\Python\TestProgram>python ez_setup.py
Downloading http://pypi.python.org/packages/2.7/s/setuptools/setuptools-0.6c11-py2.7.egg
Processing setuptools-0.6c11-py2.7.egg
Copying setuptools-0.6c11-py2.7.egg to c:\python27\lib\site-packages
Adding setuptools 0.6c11 to easy-install.pth file
Installing easy_install-script.py script to C:\Python27\Scripts
Installing easy_install.exe script to C:\Python27\Scripts
Installing easy_install.exe.manifest script to C:\Python27\Scripts
Installing easy_install-2.7-script.py script to C:\Python27\Scripts
Installing easy_install-2.7.exe script to C:\Python27\Scripts
Installing easy_install-2.7.exe.manifest script to C:\Python27\Scripts
Installed c:\python27\lib\site-packages\setuptools-0.6c11-py2.7.egg
Processing dependencies for setuptools==0.6c11
Finished processing dependencies for setuptools==0.6c11
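With setuptools in place, you can confirm that pkg_resources now imports before retrying Scrapy. This tiny check is my own, not part of Scrapy, but it mirrors the exact import that scrapy/cmdline.py failed on above:

```python
def has_pkg_resources():
    """Return True if setuptools' pkg_resources module is importable."""
    try:
        import pkg_resources  # provided by setuptools
        return True
    except ImportError:
        return False

print(has_pkg_resources())
```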
Then run:
D:\Just4Study\Python\TestProgram>C:\Python27\Scripts\scrapy version
Traceback (most recent call last):
  File "C:\Python27\lib\runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "C:\Python27\lib\runpy.py", line 72, in _run_code
    exec code in run_globals
  File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 8, in <module>
    from scrapy.crawler import CrawlerProcess
  File "C:\Python27\lib\site-packages\scrapy\crawler.py", line 5, in <module>
    from scrapy.core.engine import ExecutionEngine
  File "C:\Python27\lib\site-packages\scrapy\core\engine.py", line 14, in <module>
    from scrapy.core.downloader import Downloader
  File "C:\Python27\lib\site-packages\scrapy\core\downloader\__init__.py", line 13, in <module>
    from .middleware import DownloaderMiddlewareManager
  File "C:\Python27\lib\site-packages\scrapy\core\downloader\middleware.py", line 7, in <module>
    from scrapy.http import Request, Response
  File "C:\Python27\lib\site-packages\scrapy\http\__init__.py", line 8, in <module>
    from scrapy.http.headers import Headers
  File "C:\Python27\lib\site-packages\scrapy\http\headers.py", line 1, in <module>
    from w3lib.http import headers_dict_to_raw
ImportError: No module named w3lib.http
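For context, the name Scrapy is missing here, headers_dict_to_raw, essentially serializes a header dict into raw CRLF-joined header lines. A simplified sketch of that behaviour (my own approximation, not w3lib's actual code, which also handles bytes keys and list-valued headers) would be:

```python
def headers_dict_to_raw(headers_dict):
    """Serialize a {name: value} header dict into raw CRLF-joined lines.

    Simplified sketch only; see w3lib.http for the real implementation.
    """
    lines = ["%s: %s" % (name, value) for name, value in headers_dict.items()]
    return "\r\n".join(lines)
```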
To install w3lib, download it from https://github.com/scrapy/w3lib, then install:
D:\Just4Study\Python\TestProgram>python C:\Python27\w3lib-master\setup.py install
running install
running build
running build_py
error: package directory 'w3lib' does not exist
This fails because setup.py was invoked from outside the w3lib-master directory, so change into that directory first:
D:\Just4Study\Python\TestProgram>c:
C:\Python27\Scrapy-0.18.1>cd C:\Python27\w3lib-master
C:\Python27\w3lib-master>python setup.py install
running install
running build
running build_py
creating build
creating build\lib
creating build\lib\w3lib
copying w3lib\encoding.py -> build\lib\w3lib
copying w3lib\form.py -> build\lib\w3lib
copying w3lib\html.py -> build\lib\w3lib
copying w3lib\http.py -> build\lib\w3lib
copying w3lib\url.py -> build\lib\w3lib
copying w3lib\util.py -> build\lib\w3lib
copying w3lib\__init__.py -> build\lib\w3lib
running install_lib
creating C:\Python27\Lib\site-packages\w3lib
copying build\lib\w3lib\encoding.py -> C:\Python27\Lib\site-packages\w3lib
copying build\lib\w3lib\form.py -> C:\Python27\Lib\site-packages\w3lib
copying build\lib\w3lib\html.py -> C:\Python27\Lib\site-packages\w3lib
copying build\lib\w3lib\http.py -> C:\Python27\Lib\site-packages\w3lib
copying build\lib\w3lib\url.py -> C:\Python27\Lib\site-packages\w3lib
copying build\lib\w3lib\util.py -> C:\Python27\Lib\site-packages\w3lib
copying build\lib\w3lib\__init__.py -> C:\Python27\Lib\site-packages\w3lib
byte-compiling C:\Python27\Lib\site-packages\w3lib\encoding.py to encoding.pyc
byte-compiling C:\Python27\Lib\site-packages\w3lib\form.py to form.pyc
byte-compiling C:\Python27\Lib\site-packages\w3lib\html.py to html.pyc
byte-compiling C:\Python27\Lib\site-packages\w3lib\http.py to http.pyc
byte-compiling C:\Python27\Lib\site-packages\w3lib\url.py to url.pyc
byte-compiling C:\Python27\Lib\site-packages\w3lib\util.py to util.pyc
byte-compiling C:\Python27\Lib\site-packages\w3lib\__init__.py to __init__.pyc
running install_egg_info
Writing C:\Python27\Lib\site-packages\w3lib-1.3-py2.7.egg-info
Then run:
C:\Python27\w3lib-master>D:
D:\Just4Study\Python\TestProgram>C:\Python27\Scripts\scrapy version
Scrapy 0.18.1
D:\Just4Study\Python\TestProgram>C:\Python27\Scripts\scrapy startproject demo
D:\Just4Study\Python\TestProgram>dir
Volume in drive D is work
Volume Serial Number is F4A9-7648
Directory of D:\Just4Study\Python\TestProgram
2013/09/06 11:36 <DIR> .
2013/09/06 11:36 <DIR> ..
2013/02/06 09:58 140 AddressBook.data
2013/02/06 10:12 1,081 AddressBook.py
2013/02/04 17:25 156 backup_ver1.py
2013/09/06 11:36 <DIR> demo
2013/09/06 11:24 10,240 ez_setup.py
2013/08/28 22:07 1,042 getPhoneNumber.py
2009/07/17 14:35 1,719 oracle_export.py
2013/03/11 22:02 269 python_debug.py
2013/08/28 22:19 375 test_urllib2.py
2013/02/03 15:31 182 using_sys.py
9 File(s)  15,204 bytes
3 Dir(s)  124,353,531,904 bytes free
Never mind the other files; the demo directory is there. Let's look closer.
D:\Just4Study\Python\TestProgram>cd demo
D:\Just4Study\Python\TestProgram\demo>dir
Volume in drive D is work
Volume Serial Number is F4A9-7648
Directory of D:\Just4Study\Python\TestProgram\demo
2013/09/06 11:36 <DIR> .
2013/09/06 11:36 <DIR> ..
2013/09/06 11:36 <DIR> demo
2013/09/06 11:36 250 scrapy.cfg
1 File(s)  250 bytes
3 Dir(s)  124,353,531,904 bytes free
D:\Just4Study\Python\TestProgram\demo>cd demo
D:\Just4Study\Python\TestProgram\demo\demo>dir
Volume in drive D is work
Volume Serial Number is F4A9-7648
Directory of D:\Just4Study\Python\TestProgram\demo\demo
2013/09/06 11:36 <DIR> .
2013/09/06 11:36 <DIR> ..
2013/09/06 11:36 265 items.py
2013/09/06 11:36 258 pipelines.py
2013/09/06 11:36 448 settings.py
2013/09/06 11:02 <DIR> spiders
2013/08/28 05:46 0 __init__.py
4 File(s)  971 bytes
3 Dir(s)  124,353,531,904 bytes free
This finally matches the layout described at http://doc.scrapy.org/en/latest/intro/tutorial.html.
At this point, I believe the installation is complete.