Methodology

This section describes the methodology we used for this crowdsourcing project. It is also designed to inform other organizations, in Canada and around the world, that may be considering crowdsourcing methods for their outreach or policy work.

Although this project was both topic- and country-specific (“Privacy in Canada”), we believe our methods could quite straightforwardly be adapted for other topics and settings. For example, OpenMedia used similar methodology for a 2013-14 international project aimed at crowdsourcing fair copyright rules.

Project Background

OpenMedia has been involved in privacy work since its founding. As an Internet freedom organization, we regard working for strong privacy safeguards, especially online, as a key pillar of our work.

Accordingly, we had been engaged for many years in a wide range of privacy-related campaigns and initiatives. As outlined in more detail in the introduction to this report, by mid-2014 it was clear that Canada faced a widening privacy deficit, and that solutions needed to be identified.

OpenMedia is a community-based organization, and crowdsourcing has always been at the heart of our values. We know from experience that the best ideas and most penetrating insights often come from everyday citizens. We therefore set out to identify solutions to Canada’s privacy deficit through crowdsourcing, adapting and building on lessons learned from previous crowdsourcing exercises we had been involved in, to capitalize on the potential of the Internet for participatory democracy.

Drafting of the Questionnaire

We decided to open our crowdsourcing questionnaire with a drag-and-drop tool. Previous experience taught us that this kind of interactive tool is popular as a way of engaging people in the subject matter. It also provided a low-barrier way for people to have their say, especially as we offered the option of submitting responses at each stage of the questionnaire, rather than requiring participants to complete the entire questionnaire before they could submit their answers.
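
The mechanics of this stage-by-stage submission can be sketched as follows. The actual tool was built on Drupal 7 (PHP), as described later in this section; this TypeScript sketch, with hypothetical names such as submitStage, simply illustrates the pattern of persisting each completed stage independently:

```typescript
// Hypothetical sketch of stage-by-stage submission. The real tool was
// built on Drupal 7 (PHP); these names and types are illustrative only.

interface StageSubmission {
  participantId: string;
  stage: number;                      // which step of the questionnaire
  answers: Record<string, unknown>;   // only the answers given so far
}

// In-memory store standing in for the tool's actual persistence layer.
const responses = new Map<string, StageSubmission[]>();

// Persist each stage as soon as it is submitted, so a participant who
// exits early still contributes data for the stages they completed.
function submitStage(submission: StageSubmission): void {
  const existing = responses.get(submission.participantId) ?? [];
  existing.push(submission);
  responses.set(submission.participantId, existing);
}
```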

The drag-and-drop tool questions were shaped by input from many people, including OpenMedia staff and community members, as well as colleagues and organizations from the Protect Our Privacy Coalition.

Thanks to previous crowdsourcing projects, OpenMedia has built up a good deal of institutional experience in this area, the lessons of which we were able to apply to the design of this project. Key lessons included the importance of being clear about the crowdsourced nature of the project, the role participants would play, and the scope of the problem we were seeking to solve. We also knew the importance of ensuring that questions were phrased in a way that clearly explained the issue while remaining accessible to a non-technical audience.

We also consulted leading privacy experts and organizations, and we are particularly grateful for their assistance when it came to the wording of the more detailed questions about potential privacy reforms.

We aimed to keep the questions fairly accessible and high-level, particularly when it came to the content for the drag-and-drop tool. We decided that we would obtain the most useful information by asking participants to rank a range of pressing high-level privacy priorities in order. Their priorities, in turn, would shape the overall direction of this report. “Require a warrant…” and “End blanket surveillance…” were by some distance the top two priorities selected by participants, and each of these recommendations plays a prominent role throughout this report.

That said, we also wanted to give participants the opportunity to weigh in on specific potential reforms, such as those proposed by the federal Privacy Commissioner. For this reason, as participants worked through the crowdsourcing tool, the questions became progressively more detailed and specific, although we strove to ensure that all questions were accessible and understandable for the general public.

Finally, we gave participants an opportunity to provide open-ended feedback, and 562 participants (5.6% of the total) did so. These often detailed comments helped shape and inform the overall direction of the report, and many of them are published throughout.

Development of the Online Tool

Once the final shape of the questionnaire had been determined, we proceeded to build the online version of the tool. OpenMedia’s in-house web development team built it largely using Drupal 7, with some additional custom development required to ensure the drag-and-drop tool functioned as intended.

A number of technical challenges had to be overcome, notably ensuring that the order in which the drag-and-drop options appeared was randomized for each user. A further challenge was designing a system whereby each user who completed the tool was assigned a unique URL, which they were encouraged to share on social media. We also created a leaderboard to publicize which users had succeeded in encouraging the most new participants.
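
Both mechanisms can be sketched briefly. The shuffle below is the standard Fisher–Yates algorithm commonly used for per-user randomization; the token length, query parameter, and function names are assumptions for illustration, in TypeScript rather than the Drupal 7 (PHP) stack the tool was actually built on:

```typescript
import { randomBytes } from "crypto";

// Fisher–Yates shuffle: a standard way to show each participant the
// drag-and-drop options in a fresh random order, avoiding position bias.
function shuffleOptions<T>(options: T[]): T[] {
  const result = [...options];
  for (let i = result.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [result[i], result[j]] = [result[j], result[i]];
  }
  return result;
}

// Assign each participant who completes the tool a unique, shareable
// URL. The token format and URL scheme here are illustrative only.
function makeShareUrl(baseUrl: string): string {
  const token = randomBytes(8).toString("hex");
  return `${baseUrl}?ref=${token}`;
}
```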

We worked to ensure the tool would be usable by participants on a range of devices, including tablets and smartphones. A separate version of the drag-and-drop tool was built for mobile device users, with a pull-down menu replicating the drag-and-drop functionality. The tool was also able to detect whether a participant was using a mobile device, and to serve up the correct device-specific version automatically. The tool was tested extensively on a range of devices prior to launch.
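
Device detection of this kind was typically done at the time by inspecting the browser’s user-agent string. A minimal sketch under that assumption (the patterns below are illustrative, not the tool’s actual detection rules):

```typescript
// Crude user-agent check of the kind widely used circa 2014 to route
// visitors to a device-specific version of a web tool.
function isMobileDevice(userAgent: string): boolean {
  return /Mobi|Android|iPhone|iPad|BlackBerry/i.test(userAgent);
}

function selectToolVersion(userAgent: string): "mobile" | "desktop" {
  // Mobile devices get the pull-down-menu variant; everyone else gets
  // the full drag-and-drop interface.
  return isMobileDevice(userAgent) ? "mobile" : "desktop";
}
```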


Launching the Questionnaire

In order to maximize participation in this project, we ran an extensive publicity campaign around the launch of the tool. This encompassed sustained social media outreach, including share images, along with securing traditional and online media coverage. We also emailed OpenMedia supporters who had previously been involved with our privacy-related activities to encourage them to take part.

As outlined above, we also encouraged people who completed the tool to share it with their friends, colleagues, and networks via social media. We offered modest prize packages to incentivize this sharing. These prizes were designed to appeal to people interested in privacy, for example by including a year’s subscription to the Canadian VPN service TunnelBear.

The leaderboard mentioned above also let people keep track of their progress, and promoting it was a key component of our overall messaging. With crowdsourcing, it is critical that as many people as possible participate, and using online tools in ways that gamify the user experience can be extremely helpful in increasing participation.
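
Mechanically, a leaderboard of this kind reduces to counting the new participants each sharer recruited and ranking them in descending order. The following sketch is hypothetical; buildLeaderboard and its inputs stand in for whatever the tool actually did:

```typescript
// Hypothetical leaderboard logic: count recruits per sharer (e.g. via
// their unique ref token) and rank the top sharers in descending order.
interface LeaderboardEntry {
  rank: number;
  user: string;
  recruits: number;
}

function buildLeaderboard(
  recruitsByUser: Map<string, number>,
  topN = 10,
): LeaderboardEntry[] {
  return [...recruitsByUser.entries()]
    .sort(([, a], [, b]) => b - a)
    .slice(0, topN)
    .map(([user, recruits], i) => ({ rank: i + 1, user, recruits }));
}
```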

We sustained publicity, especially on social media, throughout the eight weeks in which the tool was ‘live’ and available for people to participate in. High-profile figures including Margaret Atwood and Antonia Zerbisias supported us by taking part in the project, and by encouraging others to do so via their social networks.

We also intensified our publicity activities in the run-up to the deadline, with a “last chance” message to ensure that people who had considered using the tool had an opportunity to do so before the close of the public participation period.

In-Person Events

Although the tool was primarily designed to be used online, we also created an offline version of the first drag-and-drop question for use at in-person events. With the assistance of volunteers, we organized three such events: one each in Vancouver, Montreal, and Halifax. See Chapter 4: The Process in this report for more details.

Facebook Town Hall

As part of our crowdsourcing work for this project, OpenMedia hosted a Facebook Town Hall on privacy issues. Tom Henheffer, executive director of our coalition partner Canadian Journalists for Free Expression, co-hosted the event with OpenMedia’s Steve Anderson. The discussion was lively, with Steve and Tom fielding questions from Canadians on privacy issues, and the NDP’s digital issues critic Charmaine Borg also joined the debate. We reached over 46,000 people with this event, and input from the Town Hall helped shape this report.

Data Collection and Analysis

We made the survey freely available online, so that individuals who wished to participate could do so even if we had not contacted them by email. Overall, 10,107 people took part, and we estimate that a majority of these were already members of OpenMedia’s community.

We also encouraged and incentivized participants to share the tool through social media, thereby introducing elements of chain-referral sampling into our total sample population. In total, 2,371 respondents (23 percent of the total sample) were recruited by other participants through social media.
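
One way such referrals can be attributed, assuming recruits arrived via the unique shared URLs described earlier (the report does not detail the actual attribution method, so this sketch is purely illustrative):

```typescript
// Hypothetical attribution step: a new respondent who arrives through a
// shared link is credited to the participant who owns that ref token.
function attributeReferral(
  tokenOwners: Map<string, string>,  // ref token -> recruiting participant
  incomingToken: string | null,      // token parsed from the arrival URL
): string | null {
  if (!incomingToken) return null;   // organic arrival, not a referral
  return tokenOwners.get(incomingToken) ?? null;
}
```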

The data collection period ran from October 22 to November 30, 2014, after which the results were analyzed.

Adaptation for a Regional or International Target Audience

While this project’s scope was specific to Canada, it should be relatively straightforward to pitch a similarly designed project at a smaller (regional) or larger (international) audience. The scope of the questions could quite easily be adjusted to make sense to the target audience in question.

We would recommend creating multilingual versions of the tool for international projects where a significant percentage of the target audience does not speak English. As noted below, resource constraints meant we were unable to produce a French-language version, which dampened response rates in parts of Canada, particularly Quebec.

This project was pitched at Canadians, who live in a country of around 35 million people. For projects pitched at smaller populations, we recommend giving careful thought to how to ensure a sufficient response rate; for example, this could mean placing a high priority on publicity activities surrounding the tool to compensate for the smaller target population.

Limitations of Our Methodology

By design, respondents could submit their responses and exit the survey at any time, without needing to answer every question. We judged this preferable to forcing participants to complete the entire survey, which risked reducing the response rate.

For this reason, Question 1 of the tool, which asked participants to rank a set of six priorities, received 10,107 responses, as it required a response from users before they could proceed. Other questions received varying numbers of responses, as detailed above and noted throughout the report.

We did not collect demographic information about participants, in part because asking for such information could have dissuaded some people from taking part. For this reason, our sense of the demographic limitations is speculative. Beyond the voluntary response bias introduced by our sampling method, the absence of an offline mode of data collection (aside from a small number of in-person events) would have further limited our sample population to people affluent enough to have easy access to the Internet.

One major limitation of the project and survey was the lack of multilingual content, and therefore the lack of discussion and involvement from non-English-speaking Canadians. Most notably, this impacted response rates in parts of Quebec. Resource constraints meant that we were unable to offer a French-language version of the tool.