Latest does not mean greatest, and resolution is not everything…

We’re constantly surrounded by advertising that promises the latest phone, gadget, or “smart” device will transform our lives. But how much of that is genuine progress, and how much is clever marketing? At what point does innovation end and salesmanship begin? Not long ago, each new generation of phones or cameras arrived with clear, tangible improvements—better resolution, brighter displays, longer battery life, smoother performance. Yet in 2026, it’s fair to ask whether these updates are still meaningful, or whether we’re being sold change for the sake of change.

Higher specs and new features may look impressive on paper, but they don’t automatically translate into a better user experience. The rise of companion apps, cloud‑dependent features, and constant internet reliance is a good example of this. Moore’s law once observed that transistor counts, and with them raw computing power, would roughly double every two years, but that curve has slowed—whether due to resource limitations, manufacturing challenges, or simply a public with less disposable income and less appetite for constant upgrades.

It’s easy to point to smartphones as the clearest example of this shift, evolving from simple communication tools into pocket‑sized entertainment and production studios. But an earlier comparison is just as revealing: the evolution of televisions and home video. When I was growing up in the 1990s, most TVs were 4:3 and limited to what we now call standard definition—roughly 480 to 576 interlaced lines—broadcast over the air. At the time, neither the graininess nor the rigid programming schedules bothered me. Today, though, it’s hard to imagine going back. We’ve moved from SD 4:3 to HD 16:9, then to 4K, and even 8K (though 8K adoption has stalled compared to earlier leaps). Some users now even prefer ultra‑wide 21:9 or 32:9 displays. But has this escalation in resolution and screen size actually resulted in better content? Does owning a high‑end camera guarantee great filmmaking, or could it be that creative limitations often push people toward more inventive storytelling?
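To put those jumps in rough perspective, here is a quick back‑of‑the‑envelope sketch of the pixel counts involved. The SD figures assume the PAL 576i and NTSC 480i broadcast rasters, and the labels are purely illustrative:

```python
# Rough pixel-count comparison of common video resolutions.
# SD frame sizes assume PAL 576i and NTSC 480i; the UHD figures use the
# standard 3840x2160 and 7680x4320 rasters.
resolutions = {
    "SD (NTSC 480i)":  (720, 480),
    "SD (PAL 576i)":   (720, 576),
    "HD (720p)":       (1280, 720),
    "Full HD (1080p)": (1920, 1080),
    "4K UHD":          (3840, 2160),
    "8K UHD":          (7680, 4320),
}

for name, (width, height) in resolutions.items():
    megapixels = width * height / 1_000_000
    print(f"{name:16} {width}x{height}  ~ {megapixels:5.1f} MP")
```

An 8K frame carries roughly 33 million pixels, about eighty times the SD picture I grew up with, yet a poorly lit or poorly framed shot looks just as poor at either end of that table.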

We now live in a prosumer era where, in theory, anyone can produce cinema‑grade content with gear they already own. Tools once reserved for professionals have become mainstream, and the gap between consumer and professional equipment has narrowed dramatically. What hasn’t changed is the skill required. A phone may shoot 8K, but resolution alone doesn’t make a shot beautiful—composition, lighting, colour, and intention do.

When I was freelancing, tight budgets forced me to work with whatever equipment I had. What I learned is that the difference between the latest gear and a five‑year‑old setup is often far smaller than people assume. The real variable is the person behind the camera—their time, effort, and craft. You can own the most expensive equipment in the world and still take poor photos or make weak films if you lack the skills or patience to use it well. It’s far better to master what you already have than to overspend on gear that won’t magically improve your work. Technology loses value quickly, and it will never replace the human eye, the human hand, or the human imagination.
