Post-PC what?

Photo of a vintage computer
Photo by bert sz on Unsplash

The ‘Post-PC’ era was supposed to signal a shift away from traditional computing (i.e., laptops and desktops) toward the age of the smartphone and tablet. In this post, I discuss a recent Asymco article by Horace Dediu and offer my own thoughts on the Post-PC idea.

Now here’s a term that I haven’t heard in a while. “Post-PC.” Horace Dediu discusses the term in his recent blog post at Asymco.

For those not familiar, the “Post-PC era” was supposed to be a societal shift away from traditional desktop computing toward a more mobile world. As Dediu points out, the term Post-PC was popularized by David D. Clark of MIT. Steve Jobs also championed the idea: it was in the context of the Post-PC era that he discussed the iPod, iPhone, and iPad (especially the latter two).

Jobs famously argued at the D8 conference that the personal computer (Mac and Windows alike) was like a truck: a workhorse for any given situation. And like trucks, PCs wouldn’t be the favoured vehicle type forever. Jobs theorized that other devices, like the smartphone and tablet, would steal away the more casual users. PCs would still be around, but they would remain the work devices (the trucks), while phones and tablets would be the cars.

Continue reading on Tech Bytes

Apple’s M1 chip: Making desktop computing cool again

Apple M1 chip: Image courtesy of Apple

My intention was not for this website to become the Apple blog, but the famous fruit company seems to be dominating much of the news cycle this year.

This past summer, at the Worldwide Developers Conference (WWDC), Apple announced that it would be transitioning from Intel to its in-house Apple Silicon over the next two years. Apple said the first Macs with in-house silicon would arrive by the end of 2020. Well, here we are, and we have a new MacBook Air, 13-inch MacBook Pro, and Mac mini.

The form factors of these devices are identical to their Intel predecessors, but the real magic is the new M1 chip inside. Is it as fast as Apple claims? Will it run our apps properly? What does the M1 chip mean for desktop computing generally? In this article, I will try to answer these questions.

Continue reading on Tech Bytes

Conference presentation: ‘Toward Open Pragmatism’, OE Global 2020 Conference


Although open licensing is a necessary component of open educational resources, the overall openness of a resource is determined by several factors beyond licensing. This paper examines the applicability of the “Open Enough” framework (McNally & Christiansen, 2019) for examining the openness of existing Open CourseWare (OCW). This previously published conceptual framework proposed eight factors that educators should consider when creating a new, or adopting an existing, open course. These factors include Copyright/Open Licensing Frameworks, Accessibility/Usability Formatting, Language, Support Costs, Assessment, Digital Distribution, File Format, and Cultural Considerations. In this study, the researchers aimed to answer the following three research questions.

1. Are these factors robust enough to analyze (or measure) the level of openness in existing OCW?

2. Are additional, or modified, factors necessary?

3. Are certain factors impractical for assessment?


For this analysis, the researchers randomly selected five recent open courses from each of two prominent OCW databases – TU Delft and MIT OpenCourseWare. The researchers came to two broad conclusions following a thorough analysis of the OCW sample.


Overall, the framework was an effective tool for analyzing open courseware, though cultural considerations and usability proved to be too subjective and were removed from the framework. The study revealed the level of openness among the sampled courses to be highly inconsistent. Some factors, assessment for example, were consistently open across the sample, while language, material costs, and file format were often quite closed. The consistent lack of editable materials was particularly surprising and led the researchers to draw some conclusions about what openness should mean for Open CourseWare. The researchers used the data to revise their existing conceptual framework into a more actionable guideline for open educators.

Conference page

PDF slides and presentation transcript

Conference presentation: ‘How open is it?’, OpenEd 2020 Conference


While open licensing is a foundational aspect of open educational resources, there are several “factors” that educators must consider to achieve openness in their course design. This study builds on the authors’ previous conceptual framework, titled “Open Enough?,” for evaluating the level of openness within Open CourseWare (OCW) (McNally & Christiansen, 2019). In the previous work, the authors proposed eight factors that educators should consider when undertaking OCW development. The authors also argued that these eight factors could be used to assess the openness of existing OCW. The goal of this pilot study was to answer the following questions:

1) Are the “Open Enough” framework and its eight factors robust enough to analyze (or measure) the level of openness in an existing OCW?

2) Are additional, or modified, factors necessary?

3) Are the factors practical measures for the assessment of existing OCW? Are there particular factors which are too subjective or too broad?

For this analysis, the authors randomly selected five recent open courses from each of two prominent OCW databases – TU Delft and MIT OpenCourseWare – for a total of ten OCW. Each course was assessed on each of the eight factors, which included Copyright/Open Licensing Frameworks, Accessibility/Usability Formatting, Language, Support Costs, Assessment, Digital Distribution, File Format, and Cultural Considerations. The level of openness of each factor was classified as Closed, Mixed, or Most Open – recognizing that these buckets of analysis are broad and could be further subdivided.

In general, the “Open Enough” framework was fairly effective for determining openness in existing OCW, with some caveats. The Cultural Considerations and Usability factors were ultimately too subjective to measure and were subsequently removed from the revised version of the framework. The analysis of these OCW showed that openness among the sampled courses was inconsistent. Some of the factors were consistently open throughout the sampled courses, while other factors, specifically Language, Materials Costs, and File Format, were quite closed. Overall, there was a lack of editable materials that led the authors to reconsider what openness should be in the context of OCW. The results of the analysis were used to revise the framework. This pilot study served as a proof of concept for using the framework as a tool for analysis.

EdTech Examined #13: Burnout

Episode 13 'Burnout' cover art

In this episode, Erik and Kris discuss the Descript video editing software, how universities continue to adopt new technologies, and the mental health impacts of the COVID-19 pandemic. They also recommend their top apps for managing mental health and provide more organization strategies for managing the demands of online learning.

Twitter: @EdTechExamined

EdTech Examined #12: Teaching in Virtual Reality

AltSpaceVR selfie of Dr. Tony Chaston, MRU Psychology

In this episode, Erik and Kris interview Dr. Anthony (Tony) Chaston, an Associate Professor in the Department of Psychology at Mount Royal University, in Calgary, Canada. Tony teaches Sensation & Perception and Research Methods at MRU. His research interests include visual perception and cognition and the use of virtual reality for reducing anxiety. Most recently, Tony has been developing a virtual reality course using the AltSpaceVR platform. Erik and Kris talk with Tony about his teaching, research, and the future of higher education.

Twitter: @EdTechExamined

EdTech Examined #11: Burner Phone

EdTech Examined podcast logo
#11: Burner Phone cover art

In this episode, Kris and Erik discuss digital whiteboard software, burner phones and burner numbers, the September 15th Apple event (new iPads and Watches!), remote podcast recording, how to set up digital breakout rooms, broadcasting software, the cost of remote education, and Winter semester uncertainty.

Twitter: @EdTechExamined

Nvidia buying ARM opens a huge can of worms


The stunning news that Nvidia was moving to acquire ARM Holdings for $40 billion has led many in the tech industry to consider the possible implications of this merger.

The first thing that comes to my mind is the relationship between ARM and the many licensees that use technologies developed by ARM. Currently, Nvidia is the leader in graphics cards (GPUs). Nvidia also has a somewhat poor relationship with Apple – is that a potential conflict of interest? Does Nvidia have the power to sever all the licensing relationships with ARM’s various partners? ARM chips are used everywhere, so any company that relies on ARM licenses would be justified in being concerned.

Continue reading on Tech Bytes

Maximum fun: Do graphics matter as much in the next console generation?

Image from Pixabay

The launch of a new console generation is so very exciting. It’s one of the few things that genuinely makes me feel like I’m a kid again. Even if you’re not a gamer, new console generations are important milestones in the computer industry because they often bring cutting-edge and innovative technology to a larger number of people. This upcoming generation is no different, with amazing new CPUs and GPUs from AMD, solid-state storage (SSDs), faster memory, and (hopefully) faster load times. Graphics certainly get better with each generation, but does increased graphical fidelity and realism matter as much in this generation? I hypothesize that it doesn’t. I believe we’re reaching a plateau of “maximum fun” (or fun saturation). What I mean is that more detailed graphics and higher-resolution textures won’t necessarily lead to better gameplay, at least for the time being. Rather, game fluidity as a result of higher frames per second (FPS) and good game mechanics are better indicators of a game’s replayability over time.

Continue reading on Tech Bytes

RISCy business: The future of x86

CC image courtesy of Gerd Altmann from Pixabay

Following the expected reveal that Apple was going to transition the Mac from Intel to Apple Silicon, I started thinking about what this would mean for the x86 architecture more broadly. This architecture has been at the heart of desktop computing for forty years, and I think Apple’s implementation of its own chips is likely to have wider implications for the computer industry. Based on my understanding of ARM – the architecture that Apple Silicon is loosely based on – I think it’s likely that x86’s, and particularly Intel’s, days as the dominant desktop chip standard are numbered.

Continue reading on Tech Bytes