Ask Hackaday: What Goes Into A Legible Font, And Why Does It Matter?


Two patent front pages for comparison: American with a serif font on the left, British with a sans-serif font on the right.

There’s an interesting cultural observation to be made as a writer based in Europe: we like our sans-serif fonts, while our American friends seem to prefer a font with a serif. It was particularly noticeable in the days of print advertising, and it becomes very obvious when looking at government documents.

We’ve brought together two 1980s patents from the respective sources to illustrate this: the American RSA encryption patent, and the British drive circuitry patent for the Sinclair flat screen CRT. The American one uses Times New Roman, while the British one uses a sans-serif font which we’re guessing may be Arial. The odd thing is that in both cases they exude formality and authority to their respective audiences, yet Americans see the sans-serif font as less formal and Europeans see the serif version as old-fashioned. If you thought Brits and Americans were divided by a common language, evidently the divide runs much deeper than that.

But What Makes Text Easier To Read?

The font display page for the Atkinson Hyperlegible font. Is this legible enough for you?

We’re told that the use of fonts such as Arial or Calibri goes a little deeper than culture or style: these sans-serif fonts offer greater readability for users with impaired vision or other conditions that impede visual comprehension. If you were wondering where the hack is in this article, it’s here, because many of us will have made device interfaces that could have been more legible.

So it’s worth asking the question: just what makes a font legible? Is there more to it than the presence or absence of a serif? In answering that question we’re indebted to the Braille Institute of America for their Atkinson Hyperlegible font, and to Mencap in the UK for their FS Me accessible font. It becomes clear that these fonts work through subtle design features intended to clearly differentiate letters. For example the uppercase “I”, lowercase “l”, and numeral “1” can be almost indistinguishable in some fonts: “Il1”, as can the zero and uppercase “O”, the lowercase “g” and “q”, and even the uppercase “B” and the numeral “8”. Crucially, the features that differentiate these letters for accessibility don’t dominate the text or turn the result into a font many readers would consider “weird”.
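
A quick way to put any candidate font through its paces, incidentally, is to set a worst-case sample line at the sizes you actually intend to use and check that every character reads unambiguously. The line below is our own suggestion rather than anything from either foundry:

    Il1| 0OQ B8 g9q S5 Z2 rn m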

Bitmap Fonts For The Unexpected Win

The typeface used in the Commodore 8-bit machines. User:HarJIT, Public domain.

It’s all very well to look at scalable fonts for high resolution work, but perhaps of more interest here are bitmap fonts. After all, it’s these we’ll be sending to our little screens from our microcontrollers. It’s fair to say that attempts to produce smooth typefaces as bitmaps on machines such as the Amiga produced mixed results, but it’s interesting to look at the “classic” ROM bitmap fonts found in the microcomputers of the day. After years of them simply flowing past the eye, it’s particularly interesting to examine them from an accessibility standpoint.

Machines such as the Sinclair Spectrum or Commodore 64 have evidently had some thought put into differentiating their characters. Their uppercase “I” has finials, for example, and we’re likely all used to the zero with a line through it to differentiate it from the uppercase “O”. Perhaps of them all it’s the IBM PC’s code page 437 that does the job most elegantly; maybe we didn’t realise what we had back in the day. A slashed zero, at least, is easy enough to roll ourselves, as the sketch below shows.
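
Here’s a minimal sketch in C: a slashed zero stored one byte per row with bit 7 as the leftmost pixel, a common convention for small OLED and LCD driver libraries. The bit pattern is our own invention rather than anything from a real ROM font, and the preview routine simply dumps the glyph to the terminal for eyeballing before it goes anywhere near a display.

    #include <stdint.h>
    #include <stdio.h>

    /* 8x8 slashed zero, one byte per row, bit 7 = leftmost pixel. */
    static const uint8_t zero_slashed[8] = {
        0x3C, /* ..####.. */
        0x42, /* .#....#. */
        0x46, /* .#...##. */
        0x5A, /* .#.##.#. */
        0x62, /* .##...#. */
        0x42, /* .#....#. */
        0x3C, /* ..####.. */
        0x00, /* ........ */
    };

    /* Print the glyph as '#' and '.' so it can be checked by eye. */
    static void preview(const uint8_t glyph[8])
    {
        for (int row = 0; row < 8; row++) {
            for (int bit = 7; bit >= 0; bit--)
                putchar(((glyph[row] >> bit) & 1) ? '#' : '.');
            putchar('\n');
        }
    }

    int main(void)
    {
        preview(zero_slashed);
        return 0;
    }

The same one-byte-per-row format drops straight into the drawBitmap-style calls of most small display libraries.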

So we understand that there are cultural preferences for particular design choices such as fonts, and for whatever reason these sometimes come ahead of technical considerations. But it’s been worth a quick look at accessible typography, and who knows, perhaps we can make our projects easier to use as a result. What fonts do you use when legibility matters?

Header: Linotype machines: AE Foster, Public domain.

jepler
14 hours ago
I certainly know I clung to bitmap fonts for my terminals for exactly this reason: every. single. pixel. was put where it was for legibility. Even in tiny fonts, like 5x7 pixels! However, eventually other concerns like extensive Unicode support became marginally more important and I gave up. Not to mention that my everyday display's DPI went way up while my eyes got way worse, so those 5x7 pixel characters eventually became illegible anyway. Back in the days of 640x480 LCD screens it was a different matter.

Microsoft To Replace All C/C++ Code With Rust By 2030

Microsoft plans to eliminate all C and C++ code across its major codebases by 2030, replacing it with Rust using AI-assisted, large-scale refactoring. "My goal is to eliminate every line of C and C++ from Microsoft by 2030," Microsoft Distinguished Engineer Galen Hunt writes in a post on LinkedIn. "Our strategy is to combine AI and Algorithms to rewrite Microsoft's largest codebases. Our North Star is '1 engineer, 1 month, 1 million lines of code.' To accomplish this previously unimaginable task, we've built a powerful code processing infrastructure. Our algorithmic infrastructure creates a scalable graph over source code at scale. Our AI processing infrastructure then enables us to apply AI agents, guided by algorithms, to make code modifications at scale. The core of this infrastructure is already operating at scale on problems such as code understanding."

Hunt says he's looking to hire a Principal Software Engineer to help with this effort. "The purpose of this Principal Software Engineer role is to help us evolve and augment our infrastructure to enable translating Microsoft's largest C and C++ systems to Rust," writes Hunt. "A critical requirement for this role is experience building production quality systems-level code in Rust -- preferably at least 3 years of experience writing systems-level code in Rust. Compiler, database, or OS implementation experience is highly desired. While compiler implementation experience is not required to apply, the willingness to acquire that experience in our team is required."
jepler
14 hours ago
Oh yeah, just have a stochastic process convert C code to Rust with no real understanding. It'll be fine.
zwol
14 hours ago
It's gonna be Annex K all over again but worse.
jepler
13 hours ago
I assumed hating Annex K was just a generalized part of my Microsoft hate but turns out, no, it really is terrible. https://www.open-std.org/jtc1/sc22/wg14/www/docs/n1969.htm
jepler
13 hours ago
...but instead of removing it they waited 6 years and then wrote a document to maybe address one of the problems (https://www.open-std.org/jtc1/sc22/wg14/www/docs/n2809.pdf - Fix set_constraint_handler_s in multithreaded environments) in a way that is surely not actually backwards compatible. They want to change the setting to be per-thread without introducing a _r variant...! Madness. It does not seem to have ended up accepted in C23. Nevertheless, slibc has made it thread-local, in violation of Annex K as written.
zwol
3 hours ago
Yeah, and none of that even gets at the most elemental issue: the _goal_ of Annex K was to create a set of functions that you could _mechanically convert_ your legacy C code to use and it would no longer have buffer overflow bugs. Turns out that doesn't work. Fixing overflow bugs requires actual human attention at each site. The best a mechanical Annex K conversion can do is turn RCE bugs into denial of service and silent data corruption bugs. ... And that's why I brought it up; "mechanically get rid of all of the C in Windows by 2030" is chasing the same impossible dream.
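A minimal sketch of that failure-mode shift, assuming an Annex K implementation such as MSVC's (glibc never shipped one):

    #define __STDC_WANT_LIB_EXT1__ 1
    #include <string.h>

    void legacy(char *dst, const char *src)
    {
        strcpy(dst, src);  /* an overflow here is a potential RCE */
    }

    void converted(char *dst, rsize_t dstsz, const char *src)
    {
        /* The mechanical rewrite can't overflow, but on oversized input
           the runtime-constraint handler fires (aborting by default on
           MSVC) and dst is zeroed: the RCE becomes denial of service or
           silently lost data, while the underlying logic bug survives. */
        strcpy_s(dst, dstsz, src);
    }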
jepler
1 hour ago
Or you can look at the endless nothing that Safe C++, cppfront, profiles, and the C++ Core Guidelines have done for any open source project I've actually seen in the wild.

HariFun #221 - (mostly) 3D printed Glockenspiel Christmas Tree

From: hwiguna
Duration: 1:14
Views: 55


Boléro

I perform Maurice Ravel's Boléro on a variety of homemade 8-bit instruments.

Formula 1 is Deploying New Jargon for 2026

Formula 1's 2026 technical regulations bring not only smaller and lighter cars but an entirely new vocabulary that fans and commentators will need to learn before the season opens in Australia in March. The drag reduction system that has been part of F1 racing since 2011 is gone, replaced by a suite of modes governing how the new active front and rear wings behave and how the hybrid powertrain delivers power. Straight Mode lowers both the front and rear wings to cut drag on designated straights, and unlike the outgoing DRS system any driver can activate it regardless of their proximity to other cars. The story adds: And there's corner mode, where the wings are in their raised position, generating downforce and making the cars corner faster. Those names are better than X-mode and Z-mode, which is what they were being called last year.

[...] Instead of using DRS as an overtaking aid, the hybrid power units will now fulfill that role. Overtake mode, which can be used if a driver is within a second of a car ahead, gives them an extra 0.5 MJ of energy and up to 350 kW from the electric motor up to 337 km/h -- without the Overtake mode, the MGU-K tapers off above 290 km/h. There's also a second Boost mode, which drivers can use to attack or defend a position, that gives a short burst of maximum power.
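
For a sense of scale (our own back-of-the-envelope arithmetic, not from the story): an extra 0.5 MJ delivered at the full 350 kW works out to 500 kJ / 350 kW ≈ 1.4 seconds of additional maximum electrical deployment per use.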

jepler
4 days ago
holographic "?" blocks on the course when

Neutrino Transmutation Observed For the First Time


Once upon a time, transmutation of the elements was a really big deal. Alchemists drove their patrons near to bankruptcy chasing the philosopher’s stone to no avail, but at least we got chemistry out of it. Nowadays, anyone with a neutron source can do some spicy transmutation. Or, if you happen to have a twelve meter sphere of liquid scintillator two kilometers underground, you can just wait a few years and let neutrinos do it for you. That’s what apparently happened at SNO+, the experiment formerly known as the Sudbury Neutrino Observatory, as announced recently.

The scintillator already lights up when struck by neutrinos, much as the heavy water in the original SNO experiment did. It will also light up, with a different energy peak, if a nitrogen-13 atom happens to decay. Except there’s no nitrogen-13 in that tank; it has a half-life of about 10 minutes. So whenever the characteristic scintillation of a neutrino event is followed shortly by an N-13 decay flash, the logical conclusion is that some of the carbon-13 in the liquid scintillator has been transmuted into that particular isotope of nitrogen.
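
The tagging logic itself is simple enough to sketch. What follows is our own illustration of the delayed-coincidence idea rather than anything from the SNO+ analysis, and every threshold in it is a placeholder:

    #include <stdio.h>

    struct event { double t_s; double e_mev; };  /* time, deposited energy */

    #define N13_HALF_LIFE_S (9.965 * 60.0)   /* nitrogen-13 half-life */
    #define WINDOW_S (3.0 * N13_HALF_LIFE_S) /* search a few half-lives ahead */

    /* Placeholder energy cut for an N-13 decay flash. */
    static int looks_like_n13_decay(const struct event *e)
    {
        return e->e_mev > 1.0 && e->e_mev < 2.3;
    }

    /* Flag any event followed, within the window, by an N-13-like flash.
       Events are assumed to be sorted by time. */
    static void tag_candidates(const struct event *ev, int n)
    {
        for (int i = 0; i < n; i++)
            for (int j = i + 1; j < n && ev[j].t_s - ev[i].t_s < WINDOW_S; j++)
                if (looks_like_n13_decay(&ev[j]))
                    printf("candidate: prompt at %.0f s, decay at %.0f s\n",
                           ev[i].t_s, ev[j].t_s);
    }

    int main(void)
    {
        const struct event demo[] = {
            { 0.0, 8.5 },   /* neutrino-like prompt event */
            { 412.0, 1.6 }, /* N-13-like flash about seven minutes later */
        };
        tag_candidates(demo, 2);
        return 0;
    }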

That’s not unexpected; it’s an interaction that’s accounted for in the models. We’ve just never seen it before, because, well. Neutrinos. They’re called “ghost particles” for a reason. Their interaction cross-section is absurdly low, so they are able to pass through matter completely unimpeded most of the time. That’s why the SNO was built 2 km underground in Sudbury’s Creighton Mine: the neutrinos could reach it, but very few cosmic rays and no surface-level radiation can. “Most of the time” is key here, though: with enough liquid scintillator (SNO+ has 780 tonnes of the stuff) eventually you’re bound to have some collisions.

Capturing this interaction was made even more difficult by the fact that it requires C-13, not the C-12 that makes up the vast majority of the carbon in the scintillator fluid. The natural abundance of carbon-13 is about 1%, which should hold for the stuff in SNO+ as well, since no effort was made to enrich the detector. It’s no wonder that it has taken a few years since SNO+ started up in 2022 for this discovery to reach statistical significance.

The full paper is on arXiv, if you care to take a gander. We’ve reported on SNO+ before, like when they used pure water to detect reactor neutrinos while they were waiting for the scintillator to be ready. As impressive as it may be, it’s worth noting that SNO is no longer the largest neutrino detector of its kind.

jepler
4 days ago
The paper refers to "⁸B neutrinos". Based on the Wikipedia article this seems to mean neutrinos given off as a side effect of a reaction chain involving a short-lived atom of boron (symbol B) with an atomic weight of 8. These neutrinos "stand out because of their higher average energies", which is presumably part of why the experiment targets them specifically. (https://en.wikipedia.org/wiki/Solar_neutrino#:~:text=decays%20into%20beryllium%2D8)