云代码 - Python code library

News Crawler

2020-10-22  Author: zlqwerty123

[python] code

import csv
import os
import urllib.request as request

import bs4
import easygui
import requests


def is_connect():
    """Return True if the network is reachable (probed via baidu.com)."""
    try:
        requests.get("https://www.baidu.com", timeout=5)
        return True
    except requests.RequestException:
        return False


if is_connect():
    here = os.getcwd()

    # Keep asking for a save directory until we get one we can write to.
    while True:
        link = easygui.enterbox("Enter a directory to save the news file", "News Crawler", here)
        csv_path = os.path.join(link, "news_data.csv")
        try:
            open(csv_path, "w").close()  # probe writability, then release the handle
            break
        except OSError:
            easygui.msgbox("Invalid path, or the file is open in another program")

    url = "http://news.sohu.com/"
    req = request.Request(url, headers={"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/86.0.4240.80 Safari/537.36 Edg/86.0.622.43"})
    response = request.urlopen(req).read().decode("utf-8")

    soup = bs4.BeautifulSoup(response, "html.parser")
    links = soup.find_all("a")       # headline anchors
    headers = soup.find_all("b")     # section headers
    passed_news = []

    with open(csv_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        for tag in headers:
            print(tag.string)
            writer.writerow([tag.string])

        # Keep anchor texts that exist and are longer than six characters
        # once spaces and newlines are stripped.
        for tag in links:
            if tag.string is not None:
                text = tag.string.replace(" ", "").replace("\n", "")
                if len(text) > 6:
                    passed_news.append(text)

        # Drop the first entry and the last four (site navigation links).
        for item in passed_news[1:-4]:
            print(item)
            writer.writerow([item])

    input("Crawl finished (press Enter to exit)")
else:
    easygui.msgbox("Please connect to the network")
    input("Press Enter to exit")
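The filtering and slicing at the end of the script are the subtle part: anchor texts are whitespace-stripped, short ones are dropped, and then the first entry and the last four are cut away as navigation links (the original's double slice `[:-4][1:]` is equivalent to the single slice `[1:-4]`). A standard-library sketch of that logic, using invented link texts in place of the live Sohu page:

```python
import csv
import io

# Invented stand-ins for soup.find_all("a") results; None mimics an
# anchor tag with no direct text child (tag.string is None).
raw = [None, "首页", "A long headline", "short", "Second headline",
       "Third headline", "nav-link-one", "nav-link-two",
       "nav-link-three", "footer-link-x"]

passed = []
for s in raw:
    if s is not None:
        text = s.replace(" ", "").replace("\n", "")
        if len(text) > 6:  # drop empty and very short texts
            passed.append(text)

# Cutting the first entry and the last four leaves only real headlines.
kept = passed[1:-4]

# Write the survivors to an in-memory CSV instead of a file on disk.
buf = io.StringIO()
writer = csv.writer(buf)
for item in kept:
    writer.writerow([item])

print(kept)                      # ['Secondheadline', 'Thirdheadline']
print(passed[:-4][1:] == kept)   # True: the double slice matches [1:-4]
```

Because the whitespace stripping happens before the length check, a headline padded with spaces still survives, while short navigation labels like "首页" are discarded.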

