Sunday 30 June 2013

Data Mining Process - Why Outsource Data Mining Service?

Overview of Data Mining and Process:
Data mining is a technique for investigating data to extract patterns and draw conclusions that meet existing requirements. It is widely used in client research, services analysis, market research and so on. It relies on mathematical algorithms and analytical skills to derive the desired results from huge database collections.

Data mining is mostly used by financial analysts, businesses and professional organizations, and a growing number of business areas are getting the maximum advantage from data extraction by using data warehouses in their small to large-scale operations.

Most of the functionalities used in the information collecting process are defined as follows:

* Retrieving Data

* Analyzing Data

* Extracting Data

* Transforming Data

* Loading Data

* Managing Databases
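As a rough illustration, the steps above can be sketched in Python. The records, field names and table here are hypothetical stand-ins, not any particular vendor's pipeline:

```python
import sqlite3

# Hypothetical raw records, standing in for data retrieved from a source system.
raw_records = [
    {"customer": "Acme Ltd", "spend": "1200.50"},
    {"customer": "Beta Inc", "spend": "870.00"},
]

def transform(record):
    # Convert the spend field from text to a number before loading.
    return (record["customer"], float(record["spend"]))

# Load the transformed rows into a managed database (in-memory for the sketch).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE spend (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO spend VALUES (?, ?)", [transform(r) for r in raw_records])

# Analyze: total spend across all customers.
total = conn.execute("SELECT SUM(amount) FROM spend").fetchone()[0]
print(total)  # 2070.5
```

In a real deployment the retrieval step would read from files, APIs or operational databases, but the retrieve, transform, load and analyze stages stay the same.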

Small, medium and large businesses alike collect huge amounts of data for the analysis and research needed to develop the business. Such large collections are valuable precisely because the information is available whenever it is required.

Why Outsource Data Online Mining Service?

Outsourcing advantages of data mining services:
o Save almost 60% in operating costs
o High quality analysis processes ensuring accuracy levels of almost 99.98%
o A guaranteed risk-free outsourcing experience, ensured by stringent information security policies and practices
o Get your project done within a quick turnaround time
o Gauge the provider's skill and expertise by taking advantage of a free trial program
o Get the gathered information presented in a simple, easy-to-access format

Thus, data mining is a very important part of web research services and a most useful process. By outsourcing data extraction and mining, you can concentrate on your core business and grow as fast as you desire.

Outsourcing Web Research is a trusted and well-known Internet market research organization with years of experience in the BPO (business process outsourcing) field.

If you want more information about data mining services and related web research services, contact us.


Source: http://ezinearticles.com/?Data-Mining-Process---Why-Outsource-Data-Mining-Service?&id=3789102

Thursday 27 June 2013

Web Mining - Applying Data Techniques


Web mining refers to applying data mining techniques to discover patterns on the web. It comes in three different types: content mining, structure mining and usage mining. Each technique has its own significance and role, depending on the company using it.

Web usage mining

Web usage mining mainly deals with what users are searching for on the web, whether multimedia or textual data. The process involves searching for and accessing information on the web and gathering it into a single document so that it can easily be processed.

Web structure mining

Here one uses graphs to analyze the structure and nodes of different websites and how they are connected to each other. Web structure mining usually comes in two different forms:

One can extract patterns from hyperlinks on different websites.

One can analyze the page structures that describe HTML and XML usage. By doing web structure mining one can also learn more about JavaScript and gain basic knowledge about web design.
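As a sketch of the hyperlink-extraction side of structure mining, Python's standard-library HTML parser can collect a page's outbound links, which then become the edges of the site graph. The markup below is a made-up stand-in for a fetched page:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href targets of <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A hypothetical page, standing in for fetched HTML.
page = '<p>See <a href="/about">about</a> and <a href="https://example.com">example</a>.</p>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # ['/about', 'https://example.com']
```

Running this over every page of a site yields the node-and-edge structure that web structure mining analyzes.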

Advantages

Web mining has many advantages that make the technology attractive, and many government agencies and corporations use it. Predictive analytics, for instance, does not require as much specialist knowledge as other forms of mining: it analyzes historical and current facts to make predictions about future events. This type of mining has really helped e-commerce, enabling personalized marketing that later yields higher trade volumes.

Government institutions use mining tools to fight terrorism and classify threats, which helps in identifying criminals in the country. In most companies it is also applied to deliver better services and customer relationship management, giving customers what they need. By doing this, companies can understand the needs of their customers better and react to those needs more quickly, allowing them to attract and retain customers, save on production costs and utilize insight into their customers' requirements. They may even identify a customer at risk of leaving and provide that customer with promotional offers to reduce the risk of losing them.

Disadvantages

The worst threat posed by mining is invasion of privacy. Privacy is generally considered lost when a person's data is obtained, disseminated or used, especially when this occurs without the consent of the person the data describes. Companies collect data for various reasons and purposes. Predictive analytics is an area that deals mainly with statistical analysis: it extracts information from the data in use and predicts future trends and behavior patterns. It is vital to note that accuracy depends on the level of business understanding and data understanding of the individual user.

Victor Cases has many hobbies and interests. As well as being a keen blogger and article writer for many sites, he has also recently created a site focusing on web mining. The site is constantly updated and has articles such as predictive analytics to read.


Source: http://ezinearticles.com/?Web-Mining---Applying-Data-Techniques&id=5054961

Tuesday 25 June 2013

Data Mining Is Useful for Business Application and Market Research Services

Today, data mining is an important tool for modern business and market research, transforming data into an information advantage. Many companies in India offer complete solutions and services around it. Data extraction provides companies with the important information they need for analysis and research.

These services are primarily used today because firms of all kinds, trade associations, retail, financial and market bodies, institutes and governments, need large amounts of information for their market research and development. Such a service allows you to receive all types of information whenever it is needed, filtered down to just the names and details you require.

This service is of great importance because its applications help businesses understand consumer actions, buying trends, industry analysis and so on. Business applications that use these services include:
1) Research services
2) Consumer behavior
3) E-commerce
4) Direct marketing
5) Financial services and
6) Customer relationship management, etc.

Benefits of Data mining services in Business

• Understand customer needs for better decisions
• Generate more business
• Target the relevant market
• Risk-free outsourcing experience
• Provide data access to business analysts
• Help to minimize risk and improve ROI
• Improve profitability by detecting unusual patterns in sales, claims and transactions
• Major decrease in direct marketing expenses

Understanding customer needs enables a better fit with the target market and generates more business. A risk-free outsourcing experience gives business analysts the data access they need to minimize risk and improve return on investment.

Using these services helps ensure that the data is more relevant to business applications. The different types of mining, such as text mining, web mining, relational database mining, graph mining, and audio and video mining, are all used in enterprise applications.


Source: http://ezinearticles.com/?Data-Mining-Is-Useful-for-Business-Application-and-Market-Research-Services&id=5123878

Monday 24 June 2013

Data Destruction Service

Data destruction is a term that refers to the removal or eradication of data from magnetic or optical computer storage media. The method of destruction varies, depending upon the medium and the method used in the process. The aim of data destruction is to destroy the data physically so as to remove the possibility of recovery.

Computer storage media requires some form of sanitization at the end of its working life, particularly when it holds sensitive information that could inadvertently be read by third parties. This is extremely relevant to businesses and corporations, where data may contain information pertaining to the general public or third parties, such as clients. Similarly, confidential corporate information, including patent designs, business strategies and other sensitive data, could easily be accessed by third parties if the data is not removed.

As I said at the outset, methods of destruction vary, depending upon storage medium. For each storage medium, a variety of destruction techniques also exist.

Optical media, such as CD-ROMs and DVDs, can be destroyed by granulating the plastics into 5mm chips. This method does not remove the data, but makes recovery near impossible. However, removal of the thin film that coats the top side of the disk, by scraping, scouring or sanding, will physically destroy the data. By contrast, the use of microwave ovens, a less conventional technique, is highly effective due to the static charge and consequent arcing across the thin-film storage layer of the disk.

Typical modern magnetic media comprises tape backup units and hard disk drives. Unfortunately, tape backup units are very hard to destroy, due to the length of the tape, upon which a film of iron oxide retains the magnetic data. Shredding of such media is possible, but requires significant financial investment in plant capable of handling such devices. Acids, in particular nitric acid at 50% concentration, will react violently with the iron oxide layer, destroying it completely within minutes. However, this process requires the removal of the outer plastic case to adequately expose the internal storage tape. In some circumstances, incineration of the storage media may be an option; however, this may inadvertently expose the operator to carcinogens and may be prohibited in certain countries.

The variety of hard disk drives available and their methods of connectivity (SATA, IDE, ATA, SCA, SCSI) mean that data destruction software has had to be intelligent enough to differentiate between these different interfaces. Software-driven destruction of hard drives is a highly efficient eradication technique that has been shrouded in urban myths, masking the true ease with which data can be permanently removed. In many instances, a single-pass binary wipe (writing random zeros and ones to the drive) will permanently remove all data from the storage device. However, international standards exist which require more extensive, multi-pass wiping processes that take far longer to complete.
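The single-pass binary wipe idea can be sketched on an ordinary file. This is only an illustration: wiping a real drive requires raw device access and must also contend with remapped sectors and wear levelling, which a file-level overwrite cannot reach.

```python
import os
import secrets

def single_pass_wipe(path):
    """Overwrite a file's contents once with random bytes, then delete it.
    A sketch only: real drive sanitization works on the raw block device
    and follows a published standard rather than a single file overwrite."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        f.write(secrets.token_bytes(size))  # one pass of random data
        f.flush()
        os.fsync(f.fileno())  # push the overwrite to stable storage
    os.remove(path)

# Demonstrate on a throwaway file.
with open("demo.bin", "wb") as f:
    f.write(b"sensitive data")
single_pass_wipe("demo.bin")
print(os.path.exists("demo.bin"))  # False
```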

Hard drive destruction may also be undertaken via other means, including granulating the drive to 5mm particles, use of a furnace to destroy the drive and ultimately recover the aluminium, and use of corrosive materials such as acids to remove the recording surfaces from the disk.

All the above methods are quantifiable, in that the destruction of the data can be confirmed, either via visual inspection of the storage device or by a software interface. However, the final data destruction technique, known as degaussing, whereby a strong electromagnet is used to remove the magnetically stored data, has no method of validation. It is for this reason that this technique has been left until last, and for the same reason the US government and military require the use of a degausser that has been approved by the NSA.


Source: http://ezinearticles.com/?Data-Destruction-Service&id=6686357

Wednesday 19 June 2013

Usefulness of Web Scraping Services

For any business or organization, surveys and market research play important roles in the strategic decision-making process. Data extraction and web scraping techniques are important tools for finding relevant data and information for your personal or business use. Many companies employ people to copy and paste data manually from web pages. This process is reliable but very costly, as it wastes time and effort: the data collected is small compared to the resources and time spent gathering it.

Nowadays, various data mining companies have developed effective web scraping techniques that can crawl over thousands of websites and their pages to harvest particular information. The information extracted is then stored in a CSV file, database, XML file, or any other source in the required format. After the data has been collected and stored, the data mining process can be used to extract the hidden patterns and trends contained in it. By understanding the correlations and patterns in the data, policies can be formulated, aiding the decision-making process. The information can also be stored for future reference.

The following are some of the common examples of data extraction process:

• Scraping a government portal in order to extract the names of the citizens eligible for a given survey
• Scraping competitor websites for feature data and product pricing
• Using web scraping to download videos and images for a stock photography site or for website design
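A minimal sketch of the competitor-pricing case: pull (name, price) pairs out of page markup and store them as CSV, the kind of format the harvested information is typically saved to. The markup and class names below are hypothetical stand-ins for a real competitor's page:

```python
import csv
import io
import re

# Hypothetical competitor page markup, standing in for fetched HTML.
html = """
<div class="product"><span class="name">Widget A</span><span class="price">$19.99</span></div>
<div class="product"><span class="name">Widget B</span><span class="price">$4.50</span></div>
"""

# Capture each product's name and price from the markup.
pairs = re.findall(r'class="name">([^<]+)</span><span class="price">\$([\d.]+)', html)

# Store the harvested rows as CSV for later analysis.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["product", "price"])
writer.writerows(pairs)
print(buf.getvalue().splitlines()[1])  # Widget A,19.99
```

A production scraper would fetch the pages over HTTP and use a tolerant HTML parser rather than a fixed pattern, but the extract-then-store shape is the same.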

Automated Data Collection
It is important to note that the web scraping process allows a company to monitor website data changes over a given time frame, collecting the data routinely on a regular basis. Automated data collection techniques are quite important as they help companies discover customer and market trends. By determining market trends, it is possible to understand customer behavior and predict how the data is likely to change.

The following are some of the examples of the automated data collection:

• Monitoring price information for particular stocks on an hourly basis
• Collecting mortgage rates from various financial institutions on a daily basis
• Checking weather reports on a regular basis as required
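One simple way to monitor such data between collection runs is to fingerprint each snapshot with a hash and compare fingerprints, so a change is detected without storing or diffing whole pages. A sketch, using made-up page content:

```python
import hashlib

def fingerprint(content: str) -> str:
    """Hash a page's content so later fetches can be compared cheaply."""
    return hashlib.sha256(content.encode("utf-8")).hexdigest()

# Two hypothetical snapshots of the same page taken on different days.
monday = "<p>Rate: 4.1%</p>"
tuesday = "<p>Rate: 4.3%</p>"

# A differing fingerprint signals that the monitored data has changed.
changed = fingerprint(monday) != fingerprint(tuesday)
print(changed)  # True
```

A scheduler (cron, or a loop with a sleep) would call the fetch-and-fingerprint step hourly or daily, matching the cadences listed above.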

By using web scraping services it is possible to extract any data that is related to your business. The data can then be downloaded into a spreadsheet or a database to be analyzed and compared. Storing the data in a database or in a required format makes it easier to interpret and understand the correlations, and to identify the hidden patterns.

Through web scraping it is possible to get quicker and more accurate results, saving many resources in terms of money and time. With data extraction services, it is possible to fetch information about pricing, mailing lists, databases, profile data and competitor data on a consistent basis. With the emergence of professional data mining companies, outsourcing these services will greatly reduce your costs while assuring you of high-quality service.



Source: http://ezinearticles.com/?Usefulness-of-Web-Scraping-Services&id=7181014

Monday 17 June 2013

How Web Data Extraction Services Will Save Your Time and Money by Automatic Data Collection

Data scraping is the process of extracting data from the web using a software program, from proven websites only. Anyone can use the extracted data for any purpose they desire, across various industries, as the web holds much of the world's important data. We provide the best web data extraction software, and we have expertise and one-of-a-kind knowledge in web data extraction, image scraping, screen scraping, email extraction services, data mining and web grabbing.

Who can use Data Scraping Services?

Data scraping and extraction services can be used by any organization, company or firm that would like data from a particular industry, data on targeted customers, a particular company, or anything else available on the net, such as email IDs, website names or search terms. Most of the time, a marketing company will use data scraping and extraction services to market a particular product in a certain industry and to reach targeted customers. For example, if company X wants to contact restaurants in California, our software can extract the data on California restaurants, and a marketing company can use this data to market its restaurant-related product. MLM and network marketing companies also use data extraction and scraping services to find new customers, extracting data on prospective customers whom they can then contact by telephone, postcard or email marketing; in this way they build their huge network and a large group for their own product and company.

We have helped many companies find the particular data they need. For example:

Web Data Extraction

Web pages are built using text-based mark-up languages (HTML and XHTML), and frequently contain a wealth of useful data in text form. However, most web pages are designed for human end users and not for ease of automated use. Because of this, toolkits that scrape web content were created. A web scraper is an API for extracting data from a web site. We help you create this kind of API to scrape data as per your need, and we provide a quality, affordable web data extraction application.

Data Collection

Normally, data transfer between programs is accomplished using data structures suited to automated processing by computers, not people. Such interchange formats and protocols are typically rigidly structured, well documented, easily parsed, and keep ambiguity to a minimum. Very often, these transmissions are not human-readable at all. That is why the key element that distinguishes data scraping from regular parsing is that the output being scraped was intended for display to an end user.

Email Extractor

A tool that automatically extracts email IDs from reliable sources is called an email extractor. It basically serves the function of collecting business contacts from various web pages, HTML files, text files or other formats, without duplicate email IDs.
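A minimal email extractor along these lines can be built with a regular expression plus de-duplication. The pattern below is a deliberately simple approximation, not a full RFC 5322 address validator, and the sample text is made up:

```python
import re

def extract_emails(text):
    """Return email addresses found in the text, without duplicates,
    preserving first-seen order (case-insensitive de-duplication)."""
    pattern = r"[\w.+-]+@[\w-]+\.[\w.-]+"
    seen = {}
    for match in re.findall(pattern, text):
        seen.setdefault(match.lower(), match)
    return list(seen.values())

sample = "Contact sales@example.com or support@example.org; sales@example.com again."
print(extract_emails(sample))  # ['sales@example.com', 'support@example.org']
```

A real tool would feed this function the text of many web pages and files and merge the results.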

Screen Scraping

Screen scraping refers to the practice of reading text information from a computer display terminal's screen, collecting visual data from a source instead of parsing data as in web scraping.

Data Mining Services

Data mining is the process of extracting patterns from information, and it is becoming an increasingly important tool for transforming data into information. We can deliver results in any format, including MS Excel, CSV, HTML and many others, according to your requirements.

Web spider

A Web spider is a computer program that browses the World Wide Web in a methodical, automated manner or in an orderly fashion. Many sites, in particular search engines, use spidering as a means of providing up-to-date data.
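The methodical, automated traversal a spider performs is essentially a breadth-first search over links, visiting each page once. A sketch over an in-memory link graph (a made-up stand-in for real HTTP fetches) shows the idea:

```python
from collections import deque

# A tiny hypothetical link graph, standing in for real pages and their outbound links.
site = {
    "/": ["/about", "/products"],
    "/about": ["/"],
    "/products": ["/products/widget", "/"],
    "/products/widget": [],
}

def crawl(start):
    """Visit pages breadth-first, never visiting the same URL twice."""
    seen, order = {start}, []
    queue = deque([start])
    while queue:
        url = queue.popleft()
        order.append(url)  # in a real spider: fetch and index the page here
        for link in site.get(url, []):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return order

print(crawl("/"))  # ['/', '/about', '/products', '/products/widget']
```

A production spider adds politeness (robots.txt, rate limits) and replaces the dictionary lookup with an HTTP fetch plus link extraction, but the queue-and-seen-set core is the same.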

Web Grabber

Web grabber is just another name for data scraping or data extraction.

Web Bot

Web Bot is a software program that is claimed to be able to predict future events by tracking keywords entered on the Internet. Web bot software is the best program to pull out articles, blogs, relevant website content and other website-related data. We have worked with many clients on data extraction, data scraping and data mining, and they are really happy with our services; we provide high-quality service and make your data work very easy and automatic.



Source: http://ezinearticles.com/?How-Web-Data-Extraction-Services-Will-Save-Your-Time-and-Money-by-Automatic-Data-Collection&id=5159023

Friday 14 June 2013

Data Mining

Data mining is the retrieving of hidden information from data using algorithms. Data mining helps to extract useful information from great masses of data, which can be used for making practical interpretations for business decision-making. It is basically a technical and mathematical process that involves the use of software and specially designed programs. Data mining is thus also known as Knowledge Discovery in Databases (KDD) since it involves searching for implicit information in large databases. The main kinds of data mining software are: clustering and segmentation software, statistical analysis software, text analysis, mining and information retrieval software and visualization software.

Data mining is gaining a lot of importance because of its vast applicability. It is being used increasingly in business applications for understanding and then predicting valuable information, like customer buying behavior and buying trends, profiles of customers, industry analysis, etc. It is basically an extension of some statistical methods like regression. However, the use of some advanced technologies makes it a decision making tool as well. Some advanced data mining tools can perform database integration, automated model scoring, exporting models to other applications, business templates, incorporating financial information, computing target columns, and more.

Some of the main applications of data mining are in direct marketing, e-commerce, customer relationship management, healthcare, the oil and gas industry, scientific tests, genetics, telecommunications, financial services and utilities. The different kinds of data are: text mining, web mining, social networks data mining, relational databases, pictorial data mining, audio data mining and video data mining.

Some of the most popular data mining tools are: decision trees, information gain, probability, probability density functions, Gaussians, maximum likelihood estimation, Gaussian Bayes classification, cross-validation, neural networks, instance-based learning (case-based/memory-based/non-parametric), regression algorithms, Bayesian networks, Gaussian mixture models, K-means and hierarchical clustering, Markov models, support vector machines, game tree search and alpha-beta search algorithms, game theory, artificial intelligence, A-star heuristic search, hill climbing, simulated annealing and genetic algorithms.

Some popular data mining software includes: Connexor Machines, Copernic Summarizer, Corpora, DocMINER, DolphinSearch, dtSearch, DS Dataset, Enkata, Entrieva, Files Search Assistant, FreeText Software Technologies, Intellexer, Insightful InFact, Inxight, ISYS:desktop, Klarity (part of Intology tools), Leximancer, Lextek Onix Toolkit, Lextek Profiling Engine, Megaputer Text Analyst, Monarch, Recommind MindServer, SAS Text Miner, SPSS LexiQuest, SPSS Text Mining for Clementine, Temis-Group, TeSSI®, Textalyser, TextPipe Pro, TextQuest, Readware, Quenza, VantagePoint, VisualText(TM), by TextAI, Wordstat. There is also free software and shareware such as INTEXT, S-EM (Spy-EM), and Vivisimo/Clusty.

Source: http://ezinearticles.com/?Data-Mining&id=196652

Thursday 13 June 2013

Data Discovery vs. Data Extraction


Looking at screen-scraping at a simplified level, there are two primary stages involved: data discovery and data extraction. Data discovery deals with navigating a web site to arrive at the pages containing the data you want, and data extraction deals with actually pulling that data off of those pages. Generally when people think of screen-scraping they focus on the data extraction portion of the process, but my experience has been that data discovery is often the more difficult of the two.

The data discovery step in screen-scraping might be as simple as requesting a single URL. For example, you might just need to go to the home page of a site and extract out the latest news headlines. On the other side of the spectrum, data discovery may involve logging in to a web site, traversing a series of pages in order to get needed cookies, submitting a POST request on a search form, traversing through search results pages, and finally following all of the "details" links within the search results pages to get to the data you're actually after. In cases of the former a simple Perl script would often work just fine. For anything much more complex than that, though, a commercial screen-scraping tool can be an incredible time-saver. Especially for sites that require logging in, writing code to handle screen-scraping can be a nightmare when it comes to dealing with cookies and such.

In the data extraction phase you've already arrived at the page containing the data you're interested in, and you now need to pull it out of the HTML. Traditionally this has typically involved creating a series of regular expressions that match the pieces of the page you want (e.g., URLs and link titles). Regular expressions can be a bit complex to deal with, so most screen-scraping applications will hide these details from you, even though they may use regular expressions behind the scenes.
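For instance, a single regular expression can capture each link's URL and title from a results page. The markup here is a made-up example of the kind of page the extraction phase receives:

```python
import re

# Hypothetical search-results markup arriving at the data extraction stage.
html = '<a href="/item/1">First result</a> <a href="/item/2">Second result</a>'

# One expression capturing each link's URL and its title text.
links = re.findall(r'<a href="([^"]+)">([^<]+)</a>', html)
print(links)  # [('/item/1', 'First result'), ('/item/2', 'Second result')]
```

This is exactly the sort of detail a commercial screen-scraping tool hides behind a point-and-click interface while doing much the same thing internally.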

As an addendum, I should probably mention a third phase that is often ignored, and that is, what do you do with the data once you've extracted it? Common examples include writing the data to a CSV or XML file, or saving it to a database. In the case of a live web site you might even scrape the information and display it in the user's web browser in real-time. When shopping around for a screen-scraping tool you should make sure that it gives you the flexibility you need to work with the data once it's been extracted.


Source: http://ezinearticles.com/?Data-Discovery-vs.-Data-Extraction&id=165396

Tuesday 11 June 2013

Data Recovery Equipment

Today, computers are an integral and indispensable part of the IT world, no matter what your line of work may be: finance, education, business consulting and investigation, IT information security, or anything else. In fact, most people take them for granted, but you should never assume your computer will be failure-free.

The foremost use of a computer is data storage. All the data is stored on a physical disk called a hard disk drive, on a magnetic layer, and it can be stricken by a wide variety of problems: logical failures such as a lost partition, an inaccessible system, human mistakes (accidental reformatting or deletion), file corruption, power surges and virus attacks; and, at worst, physical failures, typically head crashes, platter scratches and motor failures caused by overwriting, physical damage, natural disasters, etc.

Sometimes a hard drive dies or stops working without any warning signs, but at other times there are clues that something is going wrong. Changes in performance or sudden blue screens are telltale signs that the hard drive may be on its way to collapse. The most obvious and common signs are clicking, squealing, scraping or grinding noises.

As computers become more involved in our daily lives, the danger of data loss also grows.

As most of us have already experienced, data loss can be frustrating and traumatic, especially when you finally discover that your critical data cannot be recovered. As a matter of fact, the logical failures I previously mentioned can simply be worked out by a data recovery software program; but for physical failures, no! Even drives with minor physical failures need special equipment to repair the hard drive itself or recover the data.

Why does data recovery software stop there? The ordinary user-level repeated-read access method used by imaging software brings a risk of damaging the disk and head, making the lost data irretrievable. The software also skips bad sectors outright so as not to hang (freeze), yet it hangs most of the time anyway when the drive has a lot of bad sectors. Moreover, there is no guarantee that as much data as possible will be extracted, even after days or weeks wasted imaging a bad drive. That is why you should avoid it at all costs.

A unique piece of data recovery equipment known as Data Compass is widely used among experts and practitioners worldwide where traditional tools cannot reach. Data Compass physically reads the data of each sector byte by byte, good and bad alike, and copies it to a good disk using its data extraction software and hardware. Its "Shadow Disk" technology allows Data Compass to avoid, as far as possible, further damage to the drive, and ensures that data is not lost through repeated recovery attempts.

Technically speaking, it is hard to say exactly how much data can be recovered; it all depends. In most cases, data can be recovered as long as the parts of the hard drive are not severely damaged; otherwise components such as the platters, heads and spindle motor must be swapped.

A current tool named "hard drive head/platter exchange professional", used for drive disassembly and head/platter exchange, will soon be replaced by the vendor. The change is being made for optimization reasons, and the new product is a better enhancement; the new platter exchanger also allows users to work on hard drives with spacers between the platters.

If you know a lot about data recovery and have a craving for this field, you could start your own business with the right equipment and become an expert. Of course, it is not easy to find a proper option among current data recovery equipment, with sky-high prices in hard economic times. It is even worse when it comes to new versions of software for products you already possess: vendors charge every time. In this case, a free-of-charge upgrade service is the way to go.



Source: http://ezinearticles.com/?Data-Recovery-Equipment&id=1947719

Friday 7 June 2013

Data Mining Explained

Overview
Data mining is the crucial process of extracting implicit and possibly useful information from data. It uses analytical and visualization techniques to explore and present information in a format which is easily understandable by humans.

Data mining is widely used in a variety of profiling practices, such as fraud detection, marketing research, surveys and scientific discovery.

In this article I will briefly explain some of the fundamentals and its applications in the real world.

Herein I will not discuss related processes of any sorts, including Data Extraction and Data Structuring.

The Effort
Data Mining has found its application in various fields such as financial institutions, health-care & bio-informatics, business intelligence, social networks data research and many more.

Businesses use it to understand consumer behavior, analyze buying patterns of clients and expand its marketing efforts. Banks and financial institutions use it to detect credit card frauds by recognizing the patterns involved in fake transactions.

The Knack
There is definitely a knack to data mining, as there is with any other field of web research activity. That is why it is referred to as a craft rather than a science; a craft is the skilled practicing of an occupation.

One point I would like to make here is that data mining solutions offer an analytical perspective on the performance of a company based on historical data, but one needs to consider unknown external events and deceitful activities. On the flip side, it is all the more critical, especially for regulatory bodies, to forecast such activities in advance and take the necessary measures to prevent such events in future.

In Closing
There are many important niches of Web Data Research that this article has not covered. But I hope that this article will provide you with a starting point to drill down further into this subject, if you want to do so!

Should you have any queries, please feel free to mail me. I would be pleased to answer each of your queries in detail.


Source: http://ezinearticles.com/?Data-Mining-Explained&id=4341782

Wednesday 5 June 2013

Data Mining Tools - Understanding Data Mining

Data mining basically means pulling out important information from huge volumes of data. Data mining tools are used for the purposes of examining the data from various viewpoints and summarizing it into a useful database library. However, lately these tools have become computer-based applications in order to handle the growing amount of data. They are also sometimes referred to as knowledge discovery tools.

As a concept, data mining has existed since long before computers, when manual processes served as the data mining tools. Later, with the advent of fast-processing computers, analytical software tools, and increased storage capacities, automated tools were developed, which drastically improved the accuracy of analysis and the speed of mining, and also brought down the costs of operation. These methods of data mining are essentially employed to facilitate the following major tasks:

    Pull out, convert, and load data into a data warehouse system
    Collect and manage the data in a database system
    Allow the concerned personnel to retrieve the data
    Analyze the data
    Present the data in a format that can be easily interpreted for further decision making

We use these methods of mining data to explore the correlations, associations, and trends in the stored data that are generally based on the following types of relationships:

    Associations - simple relationships between the data
    Clusters - logical correlations are used to categorise the collected data
    Classes - certain predefined groups are drawn out and then data within the stored information is searched based on these groups
    Sequential patterns - this helps to predict a particular behavior based on the trends observed in the stored data
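The first of these relationship types, associations, can be sketched with a few lines of Python: count how often each pair of items appears together across a set of transactions. The shopping baskets below are made up for illustration.

```python
from collections import Counter
from itertools import combinations

# Hypothetical purchase baskets; each set is one transaction.
baskets = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk", "eggs"},
    {"bread", "butter", "eggs"},
]

# Associations: count how often each pair of items is bought together.
pairs = Counter()
for basket in baskets:
    pairs.update(combinations(sorted(basket), 2))

print(pairs.most_common(1))
```

Here the pair (bread, butter) surfaces as the strongest association, the kind of simple relationship an analyst might then act on, for example by shelving the two items together.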

Industries that cater heavily to consumers in retail, financial services, entertainment, sports, hospitality and so on rely on these data mining methods to obtain fast answers to questions that improve their business. The tools help them study the buying patterns of their consumers and hence plan a strategy to improve future sales. For example, a restaurant might want to study the eating habits of its consumers at various times during the day. That data would then help it decide on the menu for different times of the day. Data mining tools certainly help a great deal when drawing up business plans, advertising strategies, discount plans, and so on.

Some important factors to consider when selecting a data mining tool include the platforms supported, the algorithms on which it works (neural networks, decision trees), input and output options for data, the database structure and storage required, usability and ease of operation, automation processes, and reporting methods.
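The restaurant example above amounts to grouping orders by time of day and counting dishes. A minimal sketch, with an entirely hypothetical order log:

```python
from collections import defaultdict

# Hypothetical order log: (hour of day, dish ordered).
orders = [(8, "pancakes"), (8, "coffee"), (9, "pancakes"),
          (13, "burger"), (13, "salad"), (19, "steak"), (20, "steak")]

def bucket(hour):
    """Map an hour of the day to a coarse meal period."""
    if hour < 11:
        return "breakfast"
    if hour < 17:
        return "lunch"
    return "dinner"

# Count how often each dish is ordered in each meal period.
counts = defaultdict(lambda: defaultdict(int))
for hour, dish in orders:
    counts[bucket(hour)][dish] += 1

for period, dishes in counts.items():
    print(period, dict(dishes))
```

The resulting per-period counts are exactly the kind of summary a restaurant could use to tailor its menu to different times of the day.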


Source: http://ezinearticles.com/?Data-Mining-Tools---Understanding-Data-Mining&id=1109771

Monday 3 June 2013

Screen Scrap. Data Scraping. Web Data Mining.

One more reason to be careful about what you say online. The invasion of the web scraper.

Have you ever heard of the term “data scraping” or “website scraping” or “data harvesting”? If you have not, then welcome to the world of one-more-thing-you-have-to-worry-about.

The market for personal information about Internet users has become BIG business.  When you think of data scraping, think of “scrap metal”, or the harvesting of organs, or the collection of anything valuable someone can make money from selling.

Back in October 2010, the Wall Street Journal broke the story about how Nielsen Co., the media-research firm, better known as a web scraper, was hired to monitor the online “buzz” for its clients. In particular, pharmaceutical companies were desperate to gather information about consumers’ wants, desires, needs, moods, and habits. Companies are willing to pay a lot of money to identify the personal habits of the buying public.

Where do they get this valuable information from? YOU, through the Internet, specifically social media sites, résumé sites, chat rooms, discussion boards, and any other online forum where people disclose personally identifiable information about themselves.


Source: http://francineward.com/screen-scrap-data-scraping-web-data-mining/