Tag Archives: research

New Paper: Strategic analysis of a water rights conflict in the southwestern United States

A new paper by PhD Candidate Simone Philpot has just been published in the Journal of Environmental Management! Download a copy here: http://authors.elsevier.com/a/1T8CR14Z6tPQ~k 

Simone, along with co-authors Dr. Keith Hipel and Dr. Peter Johnson, uses the Graph Model for Conflict Resolution to model the longstanding dispute over water allocation between Nevada and Utah. This modeling process yields new insights into how the different actors may act in different situations. Congrats to Simone for publishing her work in a very prestigious venue!


A strategic analysis of the ongoing conflict between Nevada and Utah over groundwater allocation at Snake Valley is carried out in order to investigate how this dispute might be resolved. More specifically, the Graph Model for Conflict Resolution is employed to formally model and analyze this conflict using the decision support system GMCR+. The conflict analysis findings indicate that the dispute endures because no party has the incentive or opportunity to move beyond the present circumstances. Continued negotiations are not likely to resolve this conflict. A substantial change in the preferences or options of the disputants, or new governance tools, will be required to move this conflict forward. This may hold lessons for future groundwater conflicts. It is, however, increasingly likely that the parties will require third-party intervention, such as equal apportionment by the US Supreme Court.


  • Trans-boundary resource management;
  • Groundwater;
  • Water rights;
  • Decision support system;
  • Conflict analysis

Group Members at AAG 2016: San Francisco

Looking forward to some sun (or fog) in California! Two group members, Sara Harrison and Qing Lu, and I are off to present at the American Association of Geographers annual conference in San Francisco. I’ve pasted the links to our sessions below. Also check out the other Geothink.ca presenters as well. Hope to see you there!

The Annual Meeting of the American Association of Geographers will be in San Francisco, CA from March 29 to April 2.

Sara Harrison

Crowdsourcing in Emergency Management: A comparative analysis of crowdsourcing adoption by governments in the United States and Canada


Qing Lu 

Potential and challenges of mobile technologies in the public sector: a case study of 311 requests in Edmonton, Canada


Peter Johnson

Reflecting on the Success of Open Data: How Municipal Governments Evaluate Open Data Programs



Measuring the Value and Impact of Open Data: Recruiting Doctoral Students

I’ve recently been successful in obtaining five years of funding from the Ontario Ministry of Research and Innovation’s Early Researcher Award (ERA). This generous funding will allow me to measure the value and impact of open data initiatives, assessing how open data is accessed, used, and exploited. This research will directly impact how governments provide open data and how stakeholders such as private developers, other governments, non-profits, and citizens build applications and business models that rely on open data.

As part of this award, I am currently recruiting graduate students (PhD students in particular) who are interested in working with me on open data topics, with a focus on government provision, measuring value, and the development of metrics. If you are interested in these topics, please take a look at my comments for prospective students and the Faculty of Environment Dean’s Doctoral Initiative page for funding opportunities.

This work will build on my current open data work as a part of the SSHRC-funded Partnership Grant geothink.ca, led by Dr. Renee Sieber at McGill University.

SSHRC Insight Development Grant Award: Establishing the Value of Open Data

I’ve recently been fortunate to be awarded a SSHRC Insight Development Grant, along with Dr. Pamela Robinson from Ryerson University and Dr. Renee Sieber from McGill University, to Establish the Value of Open Data. This grant runs for two years and aims to:

1) establish the existing value of open data as reported by diverse user communities (government, non-profit, community organizations, private sector developers);

2) detail the current limitations and opportunities inherent in the open data provision system, from a multiple stakeholder perspective; and

3) derive a set of metrics to guide the evaluation of open data strategies at all levels of government, assessing possible constraints to adoption.

This research will make important contributions to current academic discourse on the value derived from government open data and the potential for open data to form a basis for citizen engagement, tracking the changing nature of the relationship between government and citizen. Key users and audiences of this research include both public and private sector organizations. Principally, this research will clarify for governments who accesses their data and how that data is used or exploited. Tracing this system of open data access and usage has implications for how government provides open data and how stakeholders such as private developers, non-profits, and citizens build durable applications and business models that rely heavily on open data.

SSHRC listing of awardees:


Geospatial Mobility Lab – Launched with support from CFI and ORF

I’ve recently been awarded funding from the Canada Foundation for Innovation and the Ontario Research Fund. I’d like to thank both of these government funding agencies for their support of a new research and training initiative that I call the ‘Geospatial Mobility Lab’. This effort is also co-sponsored through direct contributions of equipment and services from Esri Canada and Dell Computer.

I am actively recruiting students at the Masters and PhD levels to participate in research using this infrastructure. If you are interested, please read this and get in touch with me.

The Geospatial Mobility Lab in brief:

The widespread adoption of Internet-connected mobile devices has signaled a shift in the way that geographic information is both delivered and gathered. No longer tethered to desks, terminals, and Wi-Fi networks, location-based applications are now a key part of the mobile computing experience, providing a conduit for communication with space and place as a permanent backdrop. This project will develop a first-of-its-kind testbed, the Geospatial Mobility Lab, an integrated system of mobile devices and analytic infrastructure for the systematic evaluation of geospatial information and mobile technology. The Geospatial Mobility Lab will generate benefits for Canada and Canadians in three areas: direct economic benefits through software and use case developments conducted in partnership with private companies; training benefits by equipping students with marketable skills in software design, deployment, and evaluation; and social benefits through an improved understanding of the affordances and constraints of mobile device use on individual interactions, communications, and spatial behaviour. Considering the widespread adoption of mobile devices within society and the continued growth of this area of the information technology sector, the research findings will impact the millions of Canadians who use mobile devices on a daily basis.

See the official funding announcement here.

Tweet-Mapping American TV Ratings

This past winter semester I launched a new course at the University of Waterloo called “The Geoweb and Location-Based Services”. This 4th-year course introduced senior undergraduate students to the theoretical concepts and practical techniques of Web 2.0, Volunteered Geographic Information, Open Data, the Geoweb, and location-based services using mobile phones. As part of this course, students worked in groups to complete a major project.

One project that stood out was “Tweet-Mapping American TV Ratings” by the team of Andrea Minano, Sarah Knight, and Michael Goldring. The aim of their project was to analyze the relationship between social media activity and the popularity of television shows as measured by ratings. To do this, they gathered data from the social media network Twitter. According to Socialguide.com, 32 million individuals in the United States tweeted about television in 2012. Additionally, studies recently conducted by the television ratings company Nielsen suggest that Twitter is a robust way to derive TV ratings. Here you can see a map of TV show tweets from Friday, March 22nd, with the location of individual tweets shown:

TV show tweets from Friday March 22nd

The use of Twitter to rate the popularity of TV shows was tested in this project by gathering tweets from March 18, 2013 to March 24, 2013, and then mapping their spatial distribution. Click here to view an interactive map of these results. These individual tweets were then aggregated to the state level to find the most popular shows per state. The most-tweeted TV shows per state were then compared to official national TV ratings. For those Walking Dead fans, you will be pleased to note that the show basically takes over Twitter on Sunday nights:

TV Tweet Map for Sunday, March 24th

This series of maps was made with Leaflet, an open source web-mapping platform. Seven web maps were created, one for each day of the week. In each of these, US states were symbolized according to their most-tweeted TV show. Clicking on a state displays a pie chart of its three most-tweeted TV shows. Finally, two bar graphs accompany each map: one showing the three most-tweeted TV shows at the national scale, and another showing the three highest-rated TV shows at the national scale.
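The core of the state-level aggregation can be sketched in a few lines of Python. This is a simplified illustration, not the students’ actual code: the sample tweets and the `state_of` lookup are hypothetical stand-ins (a real version would geocode each point against US state boundaries with a GIS library).

```python
from collections import Counter, defaultdict

# Hypothetical geolocated tweet records: (latitude, longitude, show).
# In the project these came from Twitter, keeping only geolocated tweets.
tweets = [
    (36.17, -115.14, "The Walking Dead"),
    (36.10, -115.10, "The Walking Dead"),
    (40.71, -74.01, "Game of Thrones"),
    (40.73, -73.99, "Game of Thrones"),
]

def state_of(lat, lon):
    """Placeholder point-in-state lookup; a real implementation would do a
    point-in-polygon test against state boundary polygons."""
    return "NV" if lon < -100 else "NY"

# Tally show mentions per state, then pick each state's most-tweeted show.
counts = defaultdict(Counter)
for lat, lon, show in tweets:
    counts[state_of(lat, lon)][show] += 1

top_show = {state: c.most_common(1)[0][0] for state, c in counts.items()}
print(top_show)  # {'NV': 'The Walking Dead', 'NY': 'Game of Thrones'}
```

The resulting `top_show` dictionary is exactly what drives the choropleth symbology: one winning show per state, with the full per-state `Counter` feeding the pie charts.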

Overall, the results supported the initial hypothesis, since there was a clear correlation between the most-tweeted TV shows and official TV ratings. However, it is important to note that the results have some limitations. For instance, only about 1% of tweets are geolocated, and Twitter users are primarily aged 18 to 29. For this reason, TV shows that are popular with older audiences may receive high ratings while generating few tweets. Future studies may build on this work; nevertheless, it is evident that tweets have a relationship with TV ratings in the United States, and that these can be effectively mapped to reveal spatial patterns.

New data sources and experimental options

I’ve made some major alterations to TourSim, both in the data that it relies on, and the types of experimentation it supports. I’m thinking that this is going to make TourSim much more usable for tourism planning, and begins to incorporate many of the ideas of complexity science (such as adaptation) into TourSim.

TourSim model

First, TourSim now uses tourist preference data from the 2004 Nova Scotia Tourist Exit Survey. This survey has a wider range of accommodation and activity options, and the types of categories represented relate much more intuitively to the types of tourism products available in Nova Scotia. Additionally, the number of responses included in the Tourist Exit Survey is considerably larger than the CTS and ITS I have previously been using. The Tourist Exit Survey also segments tourists based on generating market (Atlantic, Quebec, Ontario, Western Canada, New England, Other USA, and International). Each class of tourist has their own range of activity and accommodation preferences, and you can now see the percentage of each market that is arriving in Nova Scotia.

I’ve also improved the destination adaptation function. This is designed to represent destination development in response to high levels of visitation. Several steps are used to model this function:

1) Destination Capacity. Each destination has a maximum capacity for visits, based on occupancy data provided by a mandatory reporting program conducted by the Nova Scotia Department of Tourism, Culture, and Heritage. While this capacity varies considerably from season to season, this occupancy limit represents the maximum accommodation capacity if all accommodations are open.

2) Every month, the destination examines the number of tourists who have visited in that month. If the destination is at 80% of its capacity (this threshold is adjustable by the user), then the destination increases its capacity by 5% (this percentage again can be adjusted by the user).

3) Advertising: This adaptation function also works for destinations that don’t come close to their capacities. If a destination is below 30% of its capacity, the destination “advertises”, raising the likelihood that it will be randomly selected for evaluation by a tourist. Of course, this isn’t exactly how advertising works, but in the simplified world of TourSim, things are a bit different.
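The monthly adaptation step above can be sketched roughly as follows. This is a minimal illustration of the logic, not TourSim’s actual code: the class, attribute names, and the size of the advertising boost are my own assumptions; only the 80%/5%/30% values come from the description above.

```python
# A minimal sketch of TourSim's monthly destination-adaptation step.
class Destination:
    def __init__(self, capacity, grow_at=0.80, grow_by=0.05,
                 advertise_below=0.30):
        self.capacity = capacity          # max monthly visits (occupancy data)
        self.grow_at = grow_at            # user-adjustable growth threshold
        self.grow_by = grow_by            # user-adjustable growth increment
        self.advertise_below = advertise_below
        self.ad_weight = 1.0              # weight in random destination choice

    def end_of_month(self, visits):
        occupancy = visits / self.capacity
        if occupancy >= self.grow_at:
            # Development: expand accommodation capacity by 5%.
            self.capacity *= 1 + self.grow_by
        elif occupancy < self.advertise_below:
            # "Advertising": raise the chance of being evaluated by tourists
            # (the 10% boost here is an illustrative value, not TourSim's).
            self.ad_weight *= 1.1

d = Destination(capacity=1000)
d.end_of_month(visits=850)   # 85% occupancy -> capacity grows to 1050
d.end_of_month(visits=200)   # ~19% of 1050  -> destination advertises
print(round(d.capacity), round(d.ad_weight, 2))  # 1050 1.1
```

In the model itself, a weight like `ad_weight` would feed into the probability that a tourist agent draws that destination when choosing where to go next.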

All of these variables can be manipulated by you, the user, at the start of the model. As with other versions, you can select specific destinations to focus on and compare simulation results produced with different variables. Check out the new scenario and let me know what you think!