I’ve finally published my M.A.Sc. thesis as a journal article, under the title Advances in Population Synthesis: fitting many attributes per agent and fitting to household and person margins simultaneously.
This article is the preferred citation going forward; I think it tells the story best:
A brief summary of the key contributions described in detail in my thesis
A better explanation of the U.S. context and the applicability of this work outside Canada. Statistics Canada goes to great lengths to protect Canadian privacy, and some of my work was motivated by the particular difficulties associated with Canadian census data.
Many years ago, when Google first released Google Maps and revolutionized online mapping from the stagnant MapQuest era, I put together a few quick demos showing the Vancouver and Toronto transit maps. I’ve made a few updates over the years since then, but not much more. The Vancouver one is still quite popular – more popular than TransLink’s own map, to be honest – but other web gurus made better Toronto maps, such as the excellent one by Ian Stevens.
I’ve noticed that Google has revamped the mapping APIs and is preparing to retire version 2.0. The whole treatment of online mapping is changing rapidly as the mobile market takes off. I was thinking of simply scrapping the Toronto map since it’s not well used – but then I thought a little further. What if I could make a proper map of the Greater Toronto Area? Ian’s map doesn’t cover that – in fact, there isn’t even a good print product covering the full area. Perhaps I could make something useful for the “regional traveller” using GO, and also help mobile users who have trouble with Ian’s site.
I set to work, borrowing liberally from others. It’s a patchwork by nature, since each agency has its own colour and line conventions, but hopefully still useful. When Ian made his map, taking a bitmap image and turning it into tiles was a bleeding-edge endeavour and required painstaking effort – but the tools have improved a lot. Even still, a bitmap this size (35,000 pixels square) takes some horsepower. I didn’t have the energy to do everything Ian did (like removing the background); his map will still probably work better for most TTC riders. I also couldn’t figure out what map projection Oakville Transit used, and couldn’t get it to line up nicely with the other data.
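To give a sense of why a bitmap this size takes some horsepower, here’s a back-of-the-envelope sketch (my own illustration, not the actual tool used) of how many standard 256-pixel tiles a 35,000-pixel-square image turns into across a web-map tile pyramid:

```python
import math

TILE = 256          # standard web-map tile size, in pixels
MAP_SIZE = 35_000   # the source bitmap is roughly 35,000 pixels square

# Deepest zoom level needed to show the bitmap at full resolution.
max_zoom = math.ceil(math.log2(MAP_SIZE / TILE))

total_tiles = 0
for z in range(max_zoom + 1):
    # At zoom z, the bitmap is scaled down by a factor of 2**(max_zoom - z).
    scaled_size = math.ceil(MAP_SIZE / 2 ** (max_zoom - z))
    tiles_per_side = math.ceil(scaled_size / TILE)
    total_tiles += tiles_per_side ** 2

print(f"zoom levels: 0-{max_zoom}, tiles to cut: {total_tiles}")
```

The deepest level alone is 137 × 137 = 18,769 tiles, and the whole pyramid comes to roughly 25,000 small image files to cut, store and serve.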
I’ll probably do a few more revisions on this in the next few months – an adjustable opacity slider would be nice, a legend for each operator, and higher zoom levels. But I thought I’d release a beta version and see if anyone likes it, and see how expensive the bandwidth is. Version 3 of my map is now up (and version 2 is still around for anyone who wants it). New in this version:
Local transit operator maps
More mobile friendly: full-screen view by default, location-aware (uses GPS to detect your current location, if available)
May seem slower, unless you have a new browser, like Google Chrome or Firefox 4
Graphics updates: labels cleaner, interchange stations cleaner, labels always visible instead of showing on hover (for touchscreen users)
Search tries to find a transit station first, otherwise tries other non-transit locations
No legend… yet
Added “Get directions to here” link to each station
An appalling decision from the Canadian federal government today, reported by the Globe & Mail here: “Tories scrap mandatory long-form census”
The census is a vital data source for all sorts of transportation and land use planning. A voluntary census is nearly useless, since the sample will suffer from voluntary response bias. This will do nothing to reduce the number of analysts and bureaucrats – provincial governments will be forced to step in and collect the same data themselves, but this will inevitably result in the loss of province-to-province comparisons.
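To illustrate why a voluntary survey is nearly useless here, consider a toy simulation (all numbers invented purely for illustration): when willingness to respond correlates with the attribute being measured, the voluntary sample converges on the wrong answer no matter how many forms come back.

```python
import random

random.seed(1)

# Toy population: an attribute (say, transit use) that also influences
# willingness to return a voluntary survey form. Numbers are illustrative.
N = 100_000
population = []
for _ in range(N):
    uses_transit = random.random() < 0.20          # true rate: 20%
    # Suppose transit users are twice as likely to return a voluntary form.
    responds = random.random() < (0.50 if uses_transit else 0.25)
    population.append((uses_transit, responds))

true_rate = sum(u for u, _ in population) / N
respondents = [u for u, r in population if r]
observed_rate = sum(respondents) / len(respondents)

print(f"true rate: {true_rate:.1%}, voluntary survey says: {observed_rate:.1%}")
```

The true rate is 20%, but the voluntary sample reports roughly 33% – and collecting more responses only makes that wrong answer more precise, which is exactly the voluntary response bias problem.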
As for privacy, the alleged basis for this decision: Statistics Canada jumps through all sorts of hoops to ensure the privacy of respondents. It would be difficult if not impossible to connect any of the published census data back to an individual. Yes, the questions are detailed and probing; but the anonymization process used by Stats Can is tougher than anywhere else in the world that I’ve seen.
The judicial verdict is in on the sensational case of Michael Bryant. It sounds like a case of a driver whose car’s stop/stall/roll action accidentally provoked an unstable bicycle courier, with tragic and ultimately fatal consequences. The driver appears to have behaved completely reasonably under the circumstances. The cyclist had a long history of aggressive confrontations and appears to have behaved in a threatening manner.
I take no issue with the facts of the case or the judgment. However, the incident has taken on large proportions in the media and the cycling community, and I find the official legal summary wanting in this regard. The crisp, neutral judicial language gives the document the air of Truth and Justice, when in fact it only represents The Law.
The document answers the legal problem at hand, a judgment on dangerous driving, and therefore focuses entirely on what constitutes “reasonable driver behaviour” under the circumstances. Lost in that framing is what constitutes a reasonable cyclist’s reaction – and while this cyclist was not reasonable, an emotional reaction to having your rear wheel bumped is legitimate, and a subsequent furious reaction to being sent flying over the hood of the car is also fair. Because the document is necessarily focused on the Law, all such aspects of the public debate over cyclist and driver perspectives and emotions are out of the picture. Also lost in the discussion is any mention of the differences between car/car and car/bike collisions – what may be a fender-bender in one context is an unnerving experience in the other, even if the vehicle never touches the cyclist’s skin, or is only going 13 km/h when it sends the cyclist flying over the hood.
The cycling community’s discontented reaction to the case stems from a desire for the driver world to “please understand our feelings!” Unfortunately, this particular cyclist’s aggressiveness makes the case unlikely to elicit any soul-searching. Note, however, the pattern: whether the cyclist is in the wrong (this case) or in the right (the 2008 story of a cyclist’s amputated leg), the cyclist is always the one who gets injured or killed.
The morals: cyclists must hold their tempers, no matter the incident. And, there’s a profound lack of mutual understanding and respect still out there on the streets.
Growing up in Toronto, I was a six-month cyclist and six-month pedestrian/transit rider. Since moving back a few years ago, I’ve been shifting closer to ten months of cycling. I realized that I feel much better when I get daily exercise and sunshine, and cycling is considerably faster for getting around, chaining trips and running errands.
In the process, I’ve been trying to find the right bike for the job, and have just bought a pricey Dutch bike for the coming winter. My summer bike is out of the question: it’s a nice bike, and far too vulnerable to winter salt, grit and filth.

Bicycle #1. I bought this road bike at the nadir of my student bank balance for $100. It’s as old as I am and a little too small. It handled reasonably well on winter streets, the narrow tires were good at punching through the snow to find pavement, and the vintage handlebar-end shifters were easy to use with big mitts. The caliper brakes are the deal-breaker, though: quite weak in wet conditions, and so tight around the wheel that I can’t fit both fenders and knobbly tires. I rode it over the two winters of 2006–2008, when I lived in an apartment building. There, I could use the underground parking garage for cleaning and regular maintenance, and the bike thawed overnight in slightly-above-zero conditions. Rust was a major problem: a new chain and rear cluster every spring, often cables as well, and a lot of surface oxidization wherever the paint had chipped off. (Continued in the full post, “Winter cycling, Dutch style”.)
The Copenhagenize blog has some interesting thoughts on cycling and subcultures: do the various cycling subcultures (racers, couriers, mountain bikers) in North America get in the way of making cycling appealing to normal citizens? The subcultures define themselves by gear or attitude – and I think this is quite off-putting for normal people. It’s like a world where choosing to drive made everyone think you were a Formula One fan or a fix-your-own-car guy. That said, the various European cycle chic blogs are perhaps guilty of pushing another subculture: beautiful people.
Joe Romm discusses a plausible worst case scenario for climate change. The UK’s Hadley Centre model was used to look at the effects of continued growth in fossil fuel use, including carbon feedbacks – and the results aren’t pretty. In 10% of the model runs, very high temperatures are seen as soon as 2060, and the North American numbers are 6-12°C rises.
Several years ago, I put together some Google Maps for the Vancouver and Toronto transit systems. In light of the expected opening of the Canada Line in Vancouver on August 17th, I took a shot at updating the maps.
In the interim, though, Google has made some big advances in its handling of transit. They have a full database of rapid transit stops in Toronto (GO and TTC subway), and a layer that shows the TTC subway lines as well. York Region has provided Google with full local bus data, including schedules, and Google Maps does a fairly nice job of showing that information. That said, the visuals for the transit system aren’t the most attractive, the lines showing the GO rail network are hard to see, and other major transit facilities don’t jump out at the viewer (like the York VIVA BRT-light system or the Spadina streetcar). And in Vancouver, Google still has zero data.
So, my maps still serve a purpose. The changes in this edition are:
Changed Toronto colour scheme to colour code by operator (TTC, GO, VIVA) rather than by line. Changed line thickness to represent “all-day” vs. “peak only” service
Added links to TTC and GO station websites
The debate over “what to include on the map” is growing in my mind. Should St. Clair and Spadina be in, since they have partial segregation from traffic? Should the north half of VIVA Blue really be in, when it has 15-minute frequencies in the peak hour and operates in mixed traffic?
Removed 98 B-Line and moved Canada Line to the “present day” map. Updated Canada Line alignment, added numbers of connecting buses.
Changed colours to match latest TransLink map, and changed line thickness to represent “all-day” vs. “peak only” service
Moved to more modern Google APIs now that they exist (e.g., GMarkerManager)
Removed labels from the map – the Toronto map in particular was far too cluttered, and the speed penalty for showing the labels was too high. They’re still there, but only appear when you move the mouse over a station icon.
I’ve been busy reading more on transportation and climate change over the last few months. Before I post any more “big” articles, I wanted to take a moment to praise one pair of figures from David MacKay’s great book, Sustainable Energy Without the Hot Air. It’s available free online, but I thoroughly recommend buying the elegantly typeset, figure-filled hard copy.
MacKay’s great strength lies in communicating numbers: using these simple, visual representations, a single consistent set of units throughout the book, and back-of-the-envelope calculations that are designed to illustrate the cases at hand, he’s assembled a solid book. MacKay doesn’t cover the politics or economics, and he doesn’t frame the book in terms of climate change, although that’s clearly the key underlying motivation.
Here’s the background: an example of a simple visual story, showing why burning fossil fuel reserves represents a massive change to the climate system. I’m going to paraphrase MacKay to cut to the chase, but if you prefer, please read MacKay’s original text, pages 241-243.
Throughout the book, MacKay uses a type of figure where the area of a box represents its size. The bigger the box, the bigger the number, and it’s really easy to visually compare different boxes.
“Until recently, all these pools of carbon [shown in the above figure] were roughly in balance: all flows of carbon out of a pool (say, soils, vegetation, or atmosphere) were balanced by equal flows into that pool. The flows into and out of the fossil fuel pool were both negligible. Then humans started burning fossil fuels. This added two extra unbalanced flows, as shown [in Figure 2].”
The flows shown in Figure 2 (right) run from fossil fuels to the atmosphere, and from the atmosphere to the surface waters. As shown, roughly one quarter of the fossil carbon winds up in the oceans on a short timescale of 5–10 years; however, recent research indicates this uptake may be declining as the surface waters saturate. Some carbon is also moving into vegetation and soil, perhaps 1.5 GtC/y, but this is less well measured. What is observed today is that roughly half of the fossil carbon stays in the atmosphere.
Now, throw one more number into the mix: if fossil fuels remain our dominant energy source, then carbon emissions under “business as usual” are expected to total 500 Gt of carbon over the next 50 years, with roughly 100 Gt of that expected to be stored in the surface waters of the ocean (at 2 Gt/year). By looking at the figure, it is clear that the amount of carbon in the fossil fuel supply dwarfs the carbon in the atmosphere.
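These numbers can be checked on the back of an envelope. The 500 Gt, the 2 Gt/year ocean uptake, and the roughly-half airborne fraction are from MacKay; the ~800 GtC atmospheric stock is an approximate outside figure I’m adding for scale, not a number from this passage:

```python
emissions_50yr = 500          # GtC emitted under business-as-usual, next 50 years
ocean_uptake = 2 * 50         # GtC absorbed by surface waters at 2 GtC/year
airborne_fraction = 0.5       # share observed to remain in the atmosphere today
airborne = emissions_50yr * airborne_fraction   # GtC added to the atmosphere

atmosphere_now = 800          # GtC currently in the atmosphere (assumed, for scale)
growth = airborne / atmosphere_now

print(f"~{airborne:.0f} GtC added to an ~{atmosphere_now} GtC stock: "
      f"about a {growth:.0%} increase in 50 years")
```

Even with the optimistic assumption that the oceans and biosphere keep soaking up half of what we emit, the atmospheric stock grows by nearly a third in 50 years on this one scenario alone, which is why larger burns over 50–100 years can plausibly double or triple atmospheric CO2.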
The figure shows quite clearly that if we’re talking about burning a large fraction of the remaining fossil fuels, we’re talking about major disruptions in the carbon balance. Something has to take up the carbon—either the atmosphere, surface waters, soils or vegetation, and all data to date suggests that the atmosphere is where the majority of the carbon winds up. Given the size of the fossil fuel slice, it’s easy to see how a doubling or tripling of atmospheric CO2 could happen within 50-100 years.
And yes, there’s a lot of carbon stored in the deep ocean—but as MacKay makes plainly clear in his discussion, the timescale for ocean mixing is thousands of years. “On a time-scale of 50 years, the boundary [between the surface waters and the rest of the ocean] is virtually a solid wall.”
Plus, the figure shows two risk areas: vicious cycles involving vegetation and soils. If the high temperatures (caused by high atmospheric carbon) kill off vegetation or melt the tundra and release methane, large stores of carbon from the vegetation and soils categories could easily wind up in the atmosphere – and from the size of those boxes, it’s clear that could be a catastrophe. (As it happens, the IPCC A1FI scenario expects emissions of 1800–2500 GtC by 2100 – I’d guess that means much of the easily accessible fuel is burned, plus a substantial net release of carbon from vegetation and soils.)
At any rate, I like these figures because they clearly show this concept. By using a simple visual representation of quantity, and separating the steady state out from human-driven changes, MacKay cuts to the heart of the matter. Now, for comparison, here’s an IPCC figure (from their technical reports) showing carbon stores and fluxes (source is AR4 figure 7.3):
Confused yet? Black is preindustrial, red is human-caused. The IPCC figure can certainly be understood if you spend the effort, but it doesn’t leap off the page. Admittedly, it’s from a technical chapter that’s trying to communicate carbon fluxes rather than carbon stores; the comparison isn’t entirely fair. The IPCC has produced simpler figures (for other topics) in their policy-oriented chapters, but they could still borrow a few pages from MacKay’s book.
I’ve been a little baffled by the tar sands’ villainization in the climate literature for some time. While a few photographs can clearly show that the tar sands have dire impacts on the local environment (as a recent National Geographic special showed), I haven’t really understood why the tar sands have been singled out for so much venom in the climate literature. About two years ago, I first saw the stats: if you measure all emissions from extraction to the tailpipe (a “well-to-wheel” basis), the tar sands are between 15–40% worse than conventional oil. Why, then, is tar sands oil treated as so much worse than, say, a vehicle with 15–40% poorer fuel efficiency? Here in Canada, the tar sands represent a lot of potential wealth and jobs; why should our oil get singled out relative to Saudi crude?
My reading recently took me past a figure that illuminated the problem for me in many ways. The figure below is adapted slightly from Farrell and Brandt, “Risks of the Oil Transition,” Environmental Research Letters 1(1), 2006, although I’m sure it can be found in many places in the literature.
The horizontal axis here represents the number of barrels of oil possible with each source, the vertical axis represents the greenhouse gas emissions per barrel, and the area of each bar is therefore the total emissions if all possible fuel of that type is extracted and used.
The black bar to the left of the axis represents the emissions from all oil burned to date. Everything to the right of the axis represents the potential emissions from conventional and unconventional oil, in order of price per barrel. The oil used to date is dwarfed by what remains in the ground and could be emitted in the future.
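The figure’s arithmetic is easy to sketch: each resource is a bar whose width is its size in barrels and whose height is its emissions intensity, so the bar’s area is the total emissions if that resource is fully burned. The resource sizes and intensities below are purely illustrative placeholders, not values from Farrell and Brandt:

```python
# Each entry: (name, resource size in billions of barrels,
#              well-to-wheel intensity in tonnes CO2 per barrel).
# All numbers are invented placeholders for illustration only.
resources = [
    ("conventional oil", 1_000, 0.43),
    ("tar sands",          300, 0.55),   # ~15-40% higher intensity per barrel
    ("oil shale",          500, 0.60),
]

for name, barrels_bn, intensity in resources:
    # width x height = area: total emissions if the resource is fully used
    total_gt = barrels_bn * 1e9 * intensity / 1e9   # Gt CO2
    print(f"{name:>16}: {total_gt:,.0f} Gt CO2 if fully burned")
```

The point the figure makes is visible even in placeholder numbers: the per-barrel intensity (bar height) varies modestly between sources, but the total potential emissions (bar area) are driven mostly by how much of each resource exists and gets burned.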
Coming back to the tar sands, a few points can be drawn from the figure. (Continued in the full post, “Emissions and the Tar Sands”.)