Thursday, October 8, 2015

Selenium Tutorial: Web Scraping with Selenium and Python (argument-passing example: python filename.py 2015/05/05)

Imagine what you could do if you could automate all the repetitive, boring activities you perform on the internet, like checking the first Google results for a given keyword every day, or downloading a bunch of files from different websites.

In this post you’ll learn to use Selenium with Python, a web-scraping tool that simulates a user surfing the internet. For example, you can check your social accounts, simulate a user to test your web application, and automate anything in your daily life that is repetitive. The possibilities are infinite! :-)

Here is my example code for scraping data from a sports website. It grabs all the match data and filters it by category (football, cricket, basketball, etc.). This code will help you understand in detail how Selenium works with Python and how to scrape data with it.
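The approach can be sketched as follows. Selenium drives a real browser so JavaScript-rendered content is available, and BeautifulSoup parses the rendered HTML. The CSS selector, the `data-category` attribute, and the function names here are illustrative assumptions, not taken from the actual script:

```python
from bs4 import BeautifulSoup

CATEGORIES = ("football", "cricket", "basketball")  # illustrative list


def extract_matches(html):
    """Pull match rows out of rendered HTML (the .match selector is a guess)."""
    soup = BeautifulSoup(html, "html.parser")
    return [
        {"category": row.get("data-category", ""),
         "title": row.get_text(strip=True)}
        for row in soup.select(".match")
    ]


def filter_by_category(matches, category):
    """Keep only the matches tagged with the given category."""
    return [m for m in matches if m["category"] == category]


def scrape(url):
    """Load the page in a real browser, then parse the rendered source."""
    from selenium import webdriver  # imported here: needs a browser installed
    driver = webdriver.Firefox()
    try:
        driver.get(url)
        return extract_matches(driver.page_source)
    finally:
        driver.quit()
```

The parsing helpers are kept separate from the browser code so they can be tested on plain HTML without launching Firefox.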

Requirements:
  
     Step 1 : Create a virtual env

               You need a virtual environment on your local machine. If virtualenv is already installed on your system, create one with: virtualenv scrapy. If virtualenv is not installed, install it as root: sudo pip install virtualenv. Then activate the env with: source scrapy/bin/activate.

    Step 2 : Install the dependencies in your env.

  • BeautifulSoup==3.2.1
  • EasyProcess==0.1.9
  • PyVirtualDisplay==0.1.5
  • argparse==1.2.1
  • beautifulsoup4==4.4.1
  • selenium==2.47.3
  • wsgiref==0.1.2
    Step 3 : Download the code from GitHub and run it. You will see the script downloading the match details category-wise and writing them to a txt file.
              You can run the code in two ways: with or without an argument.
              If you run it as python filename.py, you get the match details for today and tomorrow. If you run it as python filename.py 2015/05/05, you get the match details for that date (2015/05/05).
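The two invocation modes above can be sketched like this (the function name and the YYYY/MM/DD format handling are my assumptions about how the script behaves, not a copy of it):

```python
import sys
from datetime import date, timedelta


def target_dates(argv):
    """Return the dates to scrape, as YYYY/MM/DD strings.

    With no argument: today and tomorrow.
    With one argument like 2015/05/05: just that date.
    """
    if len(argv) > 1:
        return [argv[1]]
    fmt = "%Y/%m/%d"
    today = date.today()
    return [today.strftime(fmt), (today + timedelta(days=1)).strftime(fmt)]


if __name__ == "__main__":
    for d in target_dates(sys.argv):
        print("fetching match details for", d)
```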

Please make sure pip is installed on your machine.

My script for scraping is: scraping file
