[Question] scrapy pipelines can't write to a CSV file

OP: allen511081 (蓝)   2015-01-07 17:00:47
Following up on my earlier pipelines question, I've solved the KeyError on my own.
This time the problem is that nothing gets written to the CSV file. The crawl runs
without any error messages, but the data never lands in the CSV. Code attached:
pipelines.py
import csv
from myproject.items import BirdTitle, BirdName, BirdCount

class myPipeline(object):
    def __init__(self):
        self.myCSV = csv.writer(open('birds.csv', 'wb'))
        self.myCSV.writerow(['title', 'birdname', 'count'])

    def process_item(self, item, spider):
        titles = []
        names = []
        counts = []
        for title in item:
            if isinstance(item, BirdTitle):
                for title in item['title']:
                    titles.append(title)
                return titles
        for name in item:
            if isinstance(item, BirdName):
                for name in item['birdName']:
                    names.append(name)
                return names
        for count in item:
            if isinstance(item, BirdCount):
                for count in item['count']:
                    counts.append(count)
                return counts
        for a, b, c in zip(titles, names, counts):
            self.myCSV.writerow([a, b, c])
        return item
How can I fix this?
Author: alibuda174 (阿哩不达)   2015-01-07 17:33:00
What is process_item supposed to do? Why so many returns? Give an example of an item.
OP: allen511081 (蓝)   2015-01-07 20:56:00
process_item handles the data the spider scrapes. The returns hand back lists; there are three in total. An item looks like [titleA, titleB, nameA, nameB, countA, countB], and processing the item splits it into the three lists titles, names, counts.
Author: alibuda174 (阿哩不达)   2015-01-07 23:30:00
With `return titles`, doesn't process_item just end right there...
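alibuda174's point is that a bare `return` exits `process_item` immediately, so the `zip`/`writerow` loop at the bottom of the posted code is dead code. A toy sketch (the names here are illustrative, not from the project):

```python
def process_item(item):
    titles = []
    for title in item:
        titles.append(title)
    return titles        # the function ends here on every call...
    rows_written = True  # ...so nothing below this line ever runs
    return item

result = process_item(['t1', 't2'])
# result is ['t1', 't2']; rows_written was never assigned
```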
Author: ug945 (ug945)   2015-01-07 23:59:00
Just `return item` once at the very end; you don't need any of the others.
OP: allen511081 (蓝)   2015-01-08 09:57:00
I've removed the three returns. The CMD window shows items being scraped, but they still don't get saved into the CSV file; the file contains only the header line.
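For reference, a minimal working sketch of such a pipeline (Python 3). The stand-in item classes below are hypothetical placeholders for `myproject.items`; in the real project they would subclass `scrapy.Item`. The key design change: `process_item` is called once per item, so it can only accumulate; the three lists are complete only after the crawl ends, which is why the rows are zipped and written in `close_spider`, and why the file must be closed there to flush buffered rows to disk.

```python
import csv

# Hypothetical stand-ins for the project's item classes (assumption:
# they behave like dicts, as scrapy.Item does).
class BirdTitle(dict): pass
class BirdName(dict): pass
class BirdCount(dict): pass

class MyPipeline(object):
    """Accumulates the three item types, then writes matched rows."""

    def __init__(self):
        self.titles, self.names, self.counts = [], [], []

    def open_spider(self, spider):
        # Text mode with newline='' is the csv-module idiom on Python 3.
        self.file = open('birds.csv', 'w', newline='')
        self.writer = csv.writer(self.file)
        self.writer.writerow(['title', 'birdname', 'count'])

    def process_item(self, item, spider):
        # Accumulate only; no early returns.
        if isinstance(item, BirdTitle):
            self.titles.extend(item['title'])
        elif isinstance(item, BirdName):
            self.names.extend(item['birdName'])
        elif isinstance(item, BirdCount):
            self.counts.extend(item['count'])
        return item  # always hand the item back to the engine

    def close_spider(self, spider):
        # Rows can only be matched up once all three lists are complete.
        for row in zip(self.titles, self.names, self.counts):
            self.writer.writerow(list(row))
        self.file.close()  # flushes the buffered rows to disk
```

In the posted code, keeping the file object only inside `csv.writer(open(...))` also means it is never closed, so even rows that are written may sit in the buffer and never reach the file.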
