The ‘Post-PC’ era was supposed to signal a shift away from traditional computing (i.e., laptops and desktops). It was supposed to be the age of the smartphone and tablet. I discuss a recent Asymco post by Horace Dediu and provide my own thoughts on the Post-PC idea.
Now here’s a term that I haven’t heard in a while. “Post-PC.” Horace Dediu discusses the term in his recent blog post at Asymco.
For those not familiar, the “Post-PC era” was supposed to be a societal shift away from traditional desktop computing toward a more mobile world. As Dediu points out, the term Post-PC was popularized by David D. Clark from MIT. Steve Jobs also popularized the idea, as it was in the context of the Post-PC era that he discussed the iPod, iPhone, and iPad (more so the latter two).
Jobs famously argued at the D8 conference that the personal computer (including Mac and Windows) was like a truck: a workhorse for any given situation. And like trucks, PCs wouldn’t be the favoured vehicle type forever. Jobs theorized that other devices, like the smartphone and tablet, would steal away the more casual users. PCs would still be around, but they would remain the work devices (the trucks) while phones and tablets would be the cars.
My intention was not for this website to become the Apple blog, but the famous fruit company seems to be dominating much of the news cycle this year.
This past summer, at the World Wide Developer Conference (WWDC), Apple announced that it would be transitioning from Intel to its in-house Apple Silicon over the next two years. Apple said the first Macs with in-house silicon would arrive by the end of 2020. Well, here we are, and we have a new MacBook Air, 13-inch MacBook Pro, and Mac mini.
The form factors of these devices are identical to their predecessors, but the real magic is the new M1 chip inside. Is it as fast as Apple claims? Will it run our apps properly? What does the M1 chip mean for desktop computing generally? In this article, I will try to answer these questions.
The stunning news that Nvidia was moving to acquire ARM Holdings for $40 billion has led many in the tech industry to consider the possible implications of this merger.
The first thing that comes to my mind is the relationship between ARM and the many licensees that use technologies developed by ARM. Nvidia is currently the leader in graphics cards (GPUs), and it has a somewhat poor relationship with Apple. Is that a potential conflict of interest? Does Nvidia have the power to sever licensing relationships with ARM’s various partners? ARM chips are used everywhere, so any company that relies on ARM licenses would have justification to be concerned.
The launch of a new console generation is so very exciting. It’s one of the few things that genuinely makes me feel like I’m a kid again. Even if you’re not a gamer, new console generations are important milestones in the computer industry because they often bring cutting-edge and innovative technology to a larger number of people. This upcoming generation is no different, with all the amazing new CPUs and GPUs provided by AMD, solid-state storage (SSDs), faster memory, and (hopefully) faster load times. Graphics certainly get better with each generation, but does the increased graphical fidelity and realism matter as much in this generation? I hypothesize that it doesn’t. I believe we’re reaching a plateau of “maximum fun” (or fun saturation). What I mean is that more detailed graphics and higher resolution textures won’t necessarily lead to better gameplay, at least for the time being. Rather, game fluidity as a result of higher frames-per-second (FPS) and good game mechanics are better indicators of a game’s replayability over time.
Following the expected reveal that Apple was going to transition the Mac from Intel to Apple Silicon, I started thinking about what this would mean for the x86 architecture more broadly. This architecture has been at the heart of desktop computing for forty years, and I think it’s unlikely that Apple’s implementation of its own chips won’t have wider implications for the computer industry. Based on my understanding of ARM – the architecture that Apple Silicon is loosely based on – I think it’s likely that x86’s, and particularly Intel’s, days as the dominant desktop chip standard are numbered.
If you’ve been a Mac user for a long time, you know that the community can be fickle. On one hand, there’s a consistent complaint that the Mac platform doesn’t get nearly enough attention from Apple compared to iOS/iPadOS. On the other hand, this community is resistant to change. Every time there are significant changes to macOS’ look and feel, no matter how small, there seems to be immediate skepticism. The design changes coming to the Mac signify something bigger. They’re being introduced as part of a broader vision that will be brought forth when all Macs transition to Apple Silicon.
But change can be good. The changes coming to macOS Big Sur are divisive because the OS is clearly adopting a more iPad-like look and feel. It’s going to be different, and I would argue it’s a much more drastic design overhaul than the introduction of Yosemite in 2014. Many folks have focused on the iPad influences on the Mac, but I’d argue there’s more of a cross-pollination between these platforms. It’s clear to me that the platforms aren’t merging (at least not yet), but both are borrowing features from each other. This is an ecosystem play. These design changes will make it considerably easier to switch between the Mac and iPad, making owners of both happy campers. I’m not brave enough to install beta software on my primary machines, but from what I can tell, macOS Big Sur and iPadOS 14 tell us a lot about the future of these platforms.
It’s no longer a rumour. Last week, Apple announced it was transitioning its entire line of Mac computers from Intel chips to its custom “Apple Silicon” over the next two years. Why is this transition so important? And, what will this mean for the Mac and the computer industry moving forward?
Working from home during COVID-19 means we need a decent computer. Unfortunately, with the economic uncertainty that the pandemic has brought, it’s not so easy to plunk down $1500 or more for a new laptop. The good news is that our laptops are lasting longer than ever.
I recently upgraded to a beefier MacBook Pro 16″, since I use my laptop for work, graphics, audio production, and programming. Thankfully, my old 2013 13″ MacBook Pro is no slouch (see specs at bottom), and I got the battery replaced early last year. My wife showed interest in it, since she has an older, slower version of the same computer. For Mac users, there are built-in tools, as well as some tricks, that can help you migrate data between computers. Here are some strategies for migrating the data and getting your computer organized.
I discuss the release of the new 2020 iPhone SE and what it means for the tech industry. I dub this the year of “trickle-down technology.”
It’s the meaning behind this device that I find so fascinating. With the entry-level iPad, Apple was able to use an older design and a slower (though still blazing fast) CPU to get the tablet’s price point down. Aside from the recycled body, this iPhone has entirely new internals. That means all of Apple’s phones run the same CPU, at least until new devices come in September. Why not do this for their tablets? Why can’t the iPad, iPad Air, and iPad Pro all run the A12Z and keep the price variations? Perhaps this is the end goal.
Now that I’ve had a week or more to play around with the cursor support in iPadOS 13.4, I’m ready to share some brief thoughts and impressions. There’s lots of coverage of this on all the tech websites, so I won’t retread old ground. What’s most interesting to me is what’s to come. The last two years of iPad software updates have been productivity-focused, and it’s likely there’s more to come for iPadOS 14 and beyond.