
UK Company Sends Factory With 1,000C Furnace Into Space

1 Comment
A UK-based company has successfully powered up a microwave-sized space factory in orbit, proving it can run a 1,000C furnace to manufacture ultra-pure semiconductor materials in microgravity. "The work that we're doing now is allowing us to create semiconductors up to 4,000 times purer in space than we can currently make here today," says Josh Western, CEO of Space Forge. "This sort of semiconductor would go on to be in the 5G tower in which you get your mobile phone signal, it's going to be in the car charger you plug an EV into, it's going to be in the latest planes." The BBC reports: Conditions in space are ideal for making semiconductors, which have the atoms they're made of arranged in a highly ordered 3D structure. When they are being manufactured in a weightless environment, those atoms line up absolutely perfectly. The vacuum of space also means that contaminants can't sneak in. The purer and more ordered a semiconductor is, the better it works.

[...] The company's mini-factory launched on a SpaceX rocket in the summer. Since then the team has been testing its systems from their mission control in Cardiff. Veronica Viera, the company's payload operations lead, shows us an image that the satellite beamed back from space. It's taken from the inside of the furnace, and shows plasma, gas heated to about 1,000C, glowing brightly. [...]

The team is now planning to build a bigger space factory -- one that could make semiconductor material for 10,000 chips. They also need to test the technology to bring the material back to Earth. On a future mission, a heat shield named Pridwen after the legendary shield of King Arthur will be deployed to protect the spacecraft from the intense temperatures it will experience as it re-enters the Earth's atmosphere.

jepler
7 minutes ago
I'm really doubting the economics on this one. "10,000" chips is a meaningless figure. Or, more specifically, it's a figure designed to sound like a lot when it could be very little.

If you want to be extremely generous (and of course: AI!!), assume they are making a massive chip: the 762mm2 "GB202" Nvidia Blackwell chip. You'll need somewhat more than 100 "300mm" wafers, or 12kg of finished silicon die using wikipedia's mass figure. A whole mission to earth orbit, just to make 12kg of chip-stuff!

But if they're engaging in sleight of hand, and you know they are, they could be talking about a much simpler chip. The RP2350 from Raspberry Pi is only about 49mm2, or less than 1/15 the area. At the most absurd, they could be talking about "chips" like SN74LVC1G97 which fits in a 1mm2 *package*, meaning the die is well under 1mm2 in area. You could put over 10k of those on a 200mm wafer with a mass of just 53 grams.
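A quick back-of-the-envelope check of those figures (the silicon density and nominal wafer thicknesses are standard published values I'm supplying; the 10% edge-loss factor is my rough guess):

```python
import math

SI_DENSITY_G_CM3 = 2.329  # density of crystalline silicon

def wafer_mass_g(diameter_mm: float, thickness_um: float) -> float:
    """Mass of a bare silicon wafer in grams."""
    area_cm2 = math.pi * (diameter_mm / 20) ** 2  # radius in cm, squared
    return area_cm2 * (thickness_um / 10_000) * SI_DENSITY_G_CM3

# 10,000 GB202-class dies (762 mm^2 each) on 300 mm wafers,
# assuming ~10% of wafer area is lost at the edges:
dies_per_300mm_wafer = int(math.pi * 150**2 / 762 * 0.9)
wafers_needed = math.ceil(10_000 / dies_per_300mm_wafer)
print(wafers_needed)  # somewhat more than 100 wafers

# A single 200 mm wafer (~725 um nominal thickness) weighs ~53 g
# and has room for well over 10,000 one-square-millimetre dies:
print(round(wafer_mass_g(200, 725)))   # ~53 g
print(int(math.pi * 100**2))           # > 31,000 mm^2 of wafer area
```

The two extremes bracket the claim: "10,000 chips" could mean anywhere from roughly a dozen kilograms of flagship GPU silicon down to a fraction of a single 53-gram wafer.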

Honestly their whole website is a laugh. It's carbon negative! 15 tons of CO2 saved per 1kg "created" (their word). There is inadequate information given to even begin to assess this, but based on further searching it doesn't pass a sniff test.

Huge number: the "Life-cycle greenhouse gas emissions" of coal [the worst choice listed on wikipedia's page "Greenhouse gas emissions"] is 490g CO2/kWh, so we have 30,612kWh per 1kg created. That's something like the total electricity my home uses in 5 years. I found https://www.sciencedirect.com/science/article/pii/S2666445323000041 detailing resources used in semiconductor manufacturing; assuming I'm interpreting it correctly, this is a key passage:

> In 2021, average water use, energy consumption, and GHG emissions were calculated to be 8.22 L/cm2, 1.15 kWh/cm2, and 0.84 kg CO2 equivalent/cm2, respectively, based on announced data

A 300mm wafer is 706cm2 with a mass of about 125g, so in fact industry was emitting about 593kg CO2-equivalent per wafer, which works out to roughly 4.7 *tons* per kg of silicon. So, unless this paper missed an additional 10 *tons* of CO2 emissions per kg, it's simply not possible to save 15 tons per kg. But, y'all also added a whole fucking space launch to the math on your end and we haven't even tried to account for that...
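The same arithmetic in a few lines, using the paper's 0.84 kg CO2e/cm2 figure (the silicon density and nominal 775 um wafer thickness are standard values I'm assuming, not from the paper):

```python
import math

# 300 mm wafer geometry and mass:
wafer_area_cm2 = math.pi * 15**2                         # ~706 cm^2
wafer_mass_kg = wafer_area_cm2 * 0.0775 * 2.329 / 1000   # ~0.13 kg

# Terrestrial manufacturing footprint from the paper's figure:
co2_per_wafer_kg = 0.84 * wafer_area_cm2                 # ~593 kg CO2e/wafer
co2_per_kg_si = co2_per_wafer_kg / wafer_mass_kg         # ~4.7 t CO2e per kg

print(round(co2_per_wafer_kg), round(co2_per_kg_si))
```

Even on these numbers the entire terrestrial footprint is well under 5 tons per kg, so "15 tons saved per kg" exceeds what is there to be saved, before counting the launch.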
Earth, Sol system, Western spiral arm

Hugo van Kemenade: Replacing python-dateutil to remove six

1 Comment

The dateutil library is a popular and powerful Python library for dealing with dates and times.

However, it still supports Python 2.7 by depending on the six compatibility shim, and I’d prefer not to install it for Python 3.10 and higher.

Here’s how I replaced three uses of its relativedelta in a couple of CLIs that didn’t really need to use it.

One #

norwegianblue was using it to calculate six months from now:

import datetime as dt

from dateutil.relativedelta import relativedelta

now = dt.datetime.now(dt.timezone.utc)
# datetime.datetime(2025, 12, 29, 15, 59, 44, 518240, tzinfo=datetime.timezone.utc)
six_months_from_now = now + relativedelta(months=+6)
# datetime.datetime(2026, 6, 29, 15, 59, 44, 518240, tzinfo=datetime.timezone.utc)

But we don’t need to be so precise here, and 180 days is good enough, using the standard library’s datetime.timedelta:

import datetime as dt

now = dt.datetime.now(dt.timezone.utc)
# datetime.datetime(2025, 12, 29, 15, 59, 44, 518240, tzinfo=datetime.timezone.utc)
six_months_from_now = now + dt.timedelta(days=180)
# datetime.datetime(2026, 6, 27, 15, 59, 44, 518240, tzinfo=datetime.timezone.utc)

Two #

pypistats was using it to get the last day of a month:

import datetime as dt

from dateutil.relativedelta import relativedelta

year, month = 2025, 12
first = dt.date(year, month, 1)
# datetime.date(2025, 12, 1)
last = first + relativedelta(months=1) - relativedelta(days=1)
# datetime.date(2025, 12, 31)

Instead, we can use the stdlib’s calendar.monthrange:

import calendar
import datetime as dt

year, month = 2025, 12
last_day = calendar.monthrange(year, month)[1]
# 31
last = dt.date(year, month, last_day)
# datetime.date(2025, 12, 31)

Three #

Finally, to get last month as a yyyy-mm string:

import datetime as dt

from dateutil.relativedelta import relativedelta

today = dt.date.today()
# datetime.date(2025, 12, 29)
d = today - relativedelta(months=1)
# datetime.date(2025, 11, 29)
d.isoformat()[:7]
# '2025-11'

Instead:

import datetime as dt

today = dt.date.today()
# datetime.date(2025, 12, 29)
if today.month == 1:
    year, month = today.year - 1, 12
else:
    year, month = today.year, today.month - 1
# year, month == 2025, 11
f"{year}-{month:02d}"
# '2025-11'
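A variation of my own (not from the post) that avoids the January special case entirely: the day before the first of the current month always falls in the previous month.

```python
import datetime as dt

today = dt.date.today()
# Day 1 of this month, minus one day, lands in the previous month:
last_of_prev = today.replace(day=1) - dt.timedelta(days=1)
f"{last_of_prev.year}-{last_of_prev.month:02d}"
# e.g. '2025-11' when today is in December 2025
```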

Goodbye six, and we also get slightly quicker install, import and run times.

Bonus #

I recommend Adam Johnson’s tip to import datetime as dt to avoid the ambiguity of which datetime is the module and which is the class.


Header photo: Ver Sacrum calendar by Alfred Roller

jepler
1 day ago
I had noticed that the dependency chain of dateutil was a bit large (as python packages go) but I need it for its date parsing and don't know an equivalent...

Cursor CEO Warns Vibe Coding Builds 'Shaky Foundations' That Eventually Crumble

1 Comment
Michael Truell, the 25-year-old CEO and cofounder of Cursor, is drawing a sharp distinction between careful AI-assisted development and the more hands-off approach commonly known as "vibe coding." Speaking at a conference, Truell described vibe coding as a method where users "close your eyes and you don't look at the code at all and you just ask the AI to go build the thing for you." He compared it to constructing a house by putting up four walls and a roof without understanding the underlying wiring or floorboards. The approach might work for quickly mocking up a game or website, but more advanced projects face real risks.

"If you close your eyes and you don't look at the code and you have AIs build things with shaky foundations as you add another floor, and another floor, and another floor, and another floor, things start to kind of crumble," Truell said. Truell and three fellow MIT graduates created Cursor in 2022. The tool embeds AI directly into the integrated development environment and uses the context of existing code to predict the next line, generate functions, and debug errors. The difference, as Truell frames it, is that programmers stay engaged with what's happening under the hood rather than flying blind.
jepler
6 days ago
You know how every failed agile project wasn't doing agile right?

Sauropods

2 Comments and 3 Shares
Vertebrae Georg
jepler
7 days ago
Peak
1 public comment
alt_text_bot
7 days ago
Vertebrae Georg

Ask Hackaday: What Goes Into A Legible Font, And Why Does It Matter?

1 Comment

Two patent front pages, on the left American with a serif font, on the right British with a sans serif font.
American and British patents, for comparison.

There’s an interesting cultural observation to be made as a writer based in Europe, that we like our sans-serif fonts, while our American friends seem to prefer a font with a serif. It’s something that was particularly noticeable in the days of print advertising, and it becomes very obvious when looking at government documents.

We’ve brought together two 1980s patents from the respective sources to illustrate this, the American RSA encryption patent, and the British drive circuitry patent for the Sinclair flat screen CRT. The American one uses Times New Roman, while the British one uses a sans-serif font which we’re guessing may be Arial. The odd thing is in both cases they exude formality and authority to their respective audiences, but Americans see the sans-serif font as less formal and Europeans see the serif version as old-fashioned. If you thought Brits and Americans were divided by a common language, evidently it runs much deeper than that.

But What Makes Text Easier To Read?

The font display page for the Atkinson Hyperlegible font.
Is this legible enough for you?

We’re told that the use of fonts such as Arial or Calibri goes a little deeper than culture or style, in that these sans-serif fonts have greater readability for users with impaired vision or other conditions that impede visual comprehension. If you were wondering where the hack was in this article, it’s here: many of us will have made device interfaces that could have been more legible.

So it’s worth asking the question: just what makes a font legible? Is there more to it than the presence or absence of a serif? In answering that question we’re indebted to the Braille Institute of America for their Atkinson Hyperlegible font, and to Mencap in the UK for their FS Me accessible font. It becomes clear that these fonts work by subtle design features intended to clearly differentiate letters. For example the uppercase “I”, lowercase “l”, and numeral “1” can be almost indistinguishable in some fonts: “Il1”, as can the zero and uppercase “O”, the lowercase letters “g” and “q”, and even the uppercase “B” and the numeral “8”. The design features that differentiate these letters for accessibility don’t dominate the text or make a font many readers would consider “weird”.
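A throwaway snippet of my own (not from the article) for eyeballing a candidate device font: render groups of commonly-confused glyphs next to each other on the target display and check that each group stays distinguishable.

```python
# Groups of glyphs the article calls out as easily confused:
CONFUSABLE_GROUPS = ["Il1", "O0", "gq", "B8"]

def legibility_test_string() -> str:
    """A short string to render in a candidate font and inspect."""
    return "  ".join(CONFUSABLE_GROUPS)

print(legibility_test_string())  # Il1  O0  gq  B8
```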

Bitmap Fonts For The Unexpected Win

The typeface used in the Commodore 8-bit machines. User:HarJIT, Public domain.

It’s all very well to look at scalable fonts for high resolution work, but perhaps of more interest here are bitmap fonts. After all, it’s these we’ll be sending to our little screens from our microcontrollers. It’s fair to say that attempts to produce smooth typefaces as bitmaps on machines such as the Amiga produced mixed results, but it’s interesting to look at the “classic” ROM bitmap fonts as found in microcomputers back in the day. After years of their just flowing past the eye, it’s particularly interesting to examine them from an accessibility standpoint.

Machines such as the Sinclair Spectrum or Commodore 64 have evidently had some thought put into differentiating their characters. Their upper-case “I” has finials, for example, and we’re likely to all be used to the zero with a line through it to differentiate it from the uppercase “O”. Perhaps of them all it’s the IBM PC’s code page 437 that does the job most elegantly; maybe we didn’t realise what we had back in the day.

So we understand that there are cultural preferences for particular design choices such as fonts, and for whatever reason these sometimes come ahead of technical considerations. But it’s been worth a quick look at accessible typography, and who knows, perhaps we can make our projects easier to use as a result. What fonts do you use when legibility matters?

Header: Linotype machines: AE Foster, Public domain.

jepler
9 days ago
I certainly know I clung to bitmap fonts for my terminals for exactly this reason: every. single. pixel. was put where it was for legibility. Even in tiny fonts, like 5x7 pixels! However, eventually other concerns like extensive unicode support became marginally more important and I gave up. Not to mention that my everyday display's DPI went way up while my eyes got way worse, so those 5x7 pixel characters eventually became illegible anyway. Back in the days of 640x480 LCD screens it was a different matter.

Microsoft To Replace All C/C++ Code With Rust By 2030

1 Comment
Microsoft plans to eliminate all C and C++ code across its major codebases by 2030, replacing it with Rust using AI-assisted, large-scale refactoring. "My goal is to eliminate every line of C and C++ from Microsoft by 2030," Microsoft Distinguished Engineer Galen Hunt writes in a post on LinkedIn. "Our strategy is to combine AI and Algorithms to rewrite Microsoft's largest codebases. Our North Star is '1 engineer, 1 month, 1 million lines of code.' To accomplish this previously unimaginable task, we've built a powerful code processing infrastructure. Our algorithmic infrastructure creates a scalable graph over source code at scale. Our AI processing infrastructure then enables us to apply AI agents, guided by algorithms, to make code modifications at scale. The core of this infrastructure is already operating at scale on problems such as code understanding."

Hunt says he's looking to hire a Principal Software Engineer to help with this effort. "The purpose of this Principal Software Engineer role is to help us evolve and augment our infrastructure to enable translating Microsoft's largest C and C++ systems to Rust," writes Hunt. "A critical requirement for this role is experience building production quality systems-level code in Rust -- preferably at least 3 years of experience writing systems-level code in Rust. Compiler, database, or OS implementation experience is highly desired. While compiler implementation experience is not required to apply, the willingness to acquire that experience in our team is required."
jepler
9 days ago
Oh yeah, just have a stochastic process convert C code to Rust with no real understanding. It'll be fine.
zwol
9 days ago
It's gonna be Annex K all over again but worse.
jepler
9 days ago
I assumed hating Annex K was just a generalized part of my Microsoft hate but turns out, no, it really is terrible. https://www.open-std.org/jtc1/sc22/wg14/www/docs/n1969.htm
jepler
9 days ago
.. but instead of removing it they waited 6 years and then wrote a document to maybe address one of the problems (https://www.open-std.org/jtc1/sc22/wg14/www/docs/n2809.pdf - Fix set_constraint_handler_s in multithreaded environments) in a way that is surely not actually backwards compatible. They want to change the setting to be per-thread without introducing a _r variant...! madness. It does not seem to have ended up accepted in C23. Nevertheless, slibc has made it thread-local, in violation of annex K as written
zwol
9 days ago
Yeah, and none of that even gets at the most elemental issue: the _goal_ of Annex K was to create a set of functions that you could _mechanically convert_ your legacy C code to use and it would no longer have buffer overflow bugs. Turns out that doesn't work. Fixing overflow bugs requires actual human attention at each site. The best a mechanical Annex K conversion can do is turn RCE bugs into denial of service and silent data corruption bugs. ... And that's why I brought it up; "mechanically get rid of all of the C in Windows by 2030" is chasing the same impossible dream.
jepler
9 days ago
or you can look at the endless nothing that Safe C++, cppfront, profiles, and CPP Core Guidelines have actually done for any open source project I've actually seen in the wild.
zwol
8 days ago
Arguably, the most important reason why Rust code tends not to have memory safety bugs is not the specific details of Rust's memory safety rules, but just the fact that it *has* memory safety rules which are enforced by default and you have to go out of your way to bypass them.