March 26, 2014

News Video on the Web

Methodology

The data for this report were collected in three main parts: original survey work conducted by the Pew Research Center, an original content audit of websites of local and national television news outlets, and secondary aggregation and analysis by Pew Research Center of data generated by other researchers or organizations.

For the secondary analyses, original sources are cited throughout the report. We studied the data closely to determine where elements reinforced each other and where there were apparent contradictions or gaps. In doing so, the Pew Research Center’s Journalism Project endeavored to determine the value and validity of each data set. In many cases that involved going back to the sources that originally collected the research. Where data conflicted, we have included all relevant sources and tried to explain their differences, either in footnotes or in the narrative. We also sought insight from outside experts, who raised questions, offered counterarguments and challenged data where they saw fit.

For the two areas of original research, detailed methods follow below.

Surveys

Digital News Participation, Omnibus Survey, Feb. 27 – March 2, 2014

The Princeton Survey Research Associates International (PSRAI) February 2014 Omnibus Week 4 survey obtained telephone interviews with a nationally representative sample of 1,002 adults living in the continental United States. Telephone interviews were conducted by landline (500) and cellphone (502, including 272 without a landline phone). Interviews were done in English and Spanish by Princeton Data Source from Feb. 27 to March 2, 2014. Statistical results are weighted to correct known demographic discrepancies. The margin of sampling error for the complete set of weighted data is ±3.6 percentage points.
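The reported margin of sampling error can be reproduced approximately from the sample size. A minimal sketch, assuming a 95% confidence level and a design effect from weighting (the design effect value below is our assumption; the report does not state one):

```python
import math

def margin_of_error(n, deff=1.0, z=1.96, p=0.5):
    """95% margin of sampling error, in percentage points, for a
    proportion p estimated from a sample of n with design effect deff."""
    return 100 * z * math.sqrt(deff * p * (1 - p) / n)

# A simple random sample of 1,002 adults gives about +/- 3.1 points.
print(round(margin_of_error(1002), 1))  # 3.1

# A design effect of roughly 1.35 from weighting (our assumption)
# brings this to the reported +/- 3.6 points.
print(round(margin_of_error(1002, deff=1.35), 1))  # 3.6
```

The gap between the simple-random-sample figure and the reported one reflects the variance added by weighting the data to known demographics.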

Online Video Watching 2013, Omnibus Survey, July 25-28, 2013

The PSRAI July 2013 Omnibus Week 4 survey obtained telephone interviews with a nationally representative sample of 1,003 adults living in the continental United States. Telephone interviews were conducted by landline (501) and cellphone (502, including 230 without a landline phone). Interviews were done in English by Princeton Data Source from July 25 to 28, 2013. Statistical results are weighted to correct known demographic discrepancies. The margin of sampling error for the complete set of weighted data is ±3.6 percentage points.

2009 Omnibus, June 18-21, 2009

The 2009 June Omnibus Survey obtained telephone interviews with a nationally representative sample of 1,005 adults living in the continental United States. The survey was conducted by Princeton Survey Research Associates International. The interviews were conducted in English by Princeton Data Source from June 18 to 21, 2009. Statistical results are weighted to correct known demographic discrepancies. The margin of sampling error for the complete set of weighted data is ±3.6 percentage points.

2007 Tracking Survey, Feb. 15 – March 7, 2007

This report is based on the findings of a daily tracking survey on Americans’ use of the internet. The results in this report are based on data from telephone interviews conducted by Princeton Survey Research Associates from Feb. 15 to March 7, 2007, among a sample of 2,200 adults, 18 and older. For results based on the total sample, one can say with 95% confidence that the error attributable to sampling and other random effects is ±2.3 percentage points. For results based on internet users, the margin of sampling error is ±2.8 percentage points. In addition to sampling error, question wording and practical difficulties in conducting telephone surveys may introduce some error or bias into the findings of opinion polls.

Content Audit of News Websites

Pew Research Center staff audited 39 websites for the content analysis portion of this study: 32 local TV stations and 7 national news outlets. The sites were chosen to reflect local TV stations in small, medium and large markets, in all five regions of the country, and to represent affiliates of all the major networks.


Each site was studied for a number of different elements. The total sample was divided into thirds (one third for each researcher on the project). A third of the sites was then examined beginning on Friday, Dec. 13; the next third on Monday, Dec. 16; the next third on Thursday, Jan. 2; then on Tuesday, Jan. 7; and finally on Wednesday, Jan. 15. The purpose was to create a constructed five-day “week” spread out over the course of five weeks. The goal was to code each site on the list twice and to account for any changes to the sites or large news events that occurred during that period.
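The five audit dates above do form a constructed Monday-through-Friday week, which can be checked directly. A small sketch using the dates from the coding schedule:

```python
from datetime import date

# Audit dates from the coding schedule (Dec. 2013 - Jan. 2014).
audit_dates = [
    date(2013, 12, 13),  # Friday
    date(2013, 12, 16),  # Monday
    date(2014, 1, 2),    # Thursday
    date(2014, 1, 7),    # Tuesday
    date(2014, 1, 15),   # Wednesday
]

# Each weekday Monday-Friday appears exactly once, forming the
# constructed "week" spread across five calendar weeks.
weekdays = sorted(d.weekday() for d in audit_dates)
print(weekdays)  # [0, 1, 2, 3, 4] -> Mon, Tue, Wed, Thu, Fri
```

Spreading one coding day per weekday across several weeks is a standard way to keep any single day's news cycle from dominating the sample.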

1)   Researchers counted the total number of links and the total number of videos on the homepage. The percentage of video was then calculated by dividing the number of videos by the total number of links present on the homepage.

2)   Did the site have a dedicated video section on the homepage? Researchers examined the homepage of each site to determine whether it had a section devoted to video content.

3)   Was the video on the channel’s website hosted by another service, for example YouTube, Vimeo or Dailymotion, or was it hosted by the channel itself? (For all of the channels studied here it was the latter: they hosted the video themselves.)

4)   Could the video be accessed in any of the following ways: iPhone app, Android app, Windows mobile app, BlackBerry app, iPad app, on Hulu, on Vimeo, on Dailymotion, on YouTube, or on Ustream? This variable reflects whether the station or channel made its content available on any outside sites or apps.

5)   Was there an option to stream live broadcasts on the station or channel’s website? If so, did it require any kind of login or registration?
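The homepage video share in item 1 is a simple ratio of the two counts researchers recorded. A minimal sketch, with hypothetical link and video counts for illustration:

```python
def video_share(num_videos, num_links):
    """Percentage of homepage links that are videos (item 1 above).
    Returns 0.0 for an empty homepage to avoid division by zero."""
    if num_links == 0:
        return 0.0
    return 100 * num_videos / num_links

# Hypothetical homepage with 120 links, 18 of which are videos.
print(round(video_share(18, 120), 1))  # 15.0
```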