Building an AMD-based PC

It’s time to retire my 10.5-year-old desktop PC, which isn’t able to run Windows 11.

Much as I hate to abandon a company that has been passionate about DEI, I think it is time to switch to the AMD side (way better for gaming, which I’m not allowed to do; somewhat better for productivity).

Workload:

  • Adobe Premiere (not very frequently)
  • photo editing
  • training some AI models (if nothing else, I want to train and run a local AI model for photo library search; a sketch of how that might look appears after this list)
  • general Web browsing
  • Zoom and Teams for work
  • Microsoft Office
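
For the photo-library-search item above, here is a minimal sketch of what running a local AI model for search could look like, assuming the open-source sentence-transformers package and its CLIP checkpoint (clip-ViT-B-32); the folder path and query string are placeholders:

```python
# Minimal local photo-search sketch (assumes: pip install sentence-transformers pillow).
# CLIP maps photos and text into the same embedding space, so a text query can rank
# photos by cosine similarity without any per-user training.
from pathlib import Path
from PIL import Image
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("clip-ViT-B-32")              # downloads once, then runs locally

photo_paths = sorted(Path("D:/Photos").rglob("*.jpg"))    # hypothetical library location
photo_embs = model.encode([Image.open(p) for p in photo_paths],
                          batch_size=32, convert_to_tensor=True)  # a real tool would stream/cache

query_emb = model.encode("golden retriever on the beach", convert_to_tensor=True)
scores = util.cos_sim(query_emb, photo_embs)[0]
for idx in scores.argsort(descending=True)[:10]:          # ten best matches
    print(photo_paths[idx], float(scores[idx]))
```

No per-photo training would be required to get started: CLIP already maps images and text into a shared space, so "training" could be limited to fine-tuning if the out-of-the-box search quality disappoints.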

Dreams:

  • 16 TB M.2 SSD (nobody seems to make this and thus the build below is what I think is the best 8 TB)
  • as many USB-C ports as possible (3 on the back and 1 on the front seems to be the limit; the ASRock LiveMixer motherboard below was picked to get beyond the standard 2 on the back)
  • reasonably compact case (currently have a Fractal Design Define 7 that is quiet, but absurdly huge)
  • quiet
  • built-in UPS that can handle outages of up to 30 seconds (the typical Florida power outage is just a few seconds; I guess a 1-minute supply would be necessary to allow the machine to shut down gracefully if power is still out after 30 seconds; see the back-of-the-envelope energy math after this list; does nobody make this because consumers see that they can get 30 minutes out of an inexpensive desk-cluttering standard external UPS?)
  • built-in CD/DVD reader (will give up for compactness and plug in via USB-C)
  • built-in reader for SD and CFExpress cards (these don’t seem to exist either for 5.25″ or 3.5″ slots; there are some cheap/old readers that fit into 5.25″ slots that read old CF cards, but not CFExpress?)
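
On the built-in UPS item, some back-of-the-envelope math shows why a 30-to-60-second hold-up supply would be physically tiny (the wattage is an assumption, not a measurement of this build):

```python
# Back-of-the-envelope energy budget for riding out a short outage (assumed numbers).
load_watts = 500            # assumed worst-case draw; this build could pull more under full GPU load
hold_seconds = 60           # ride out 30 s of outage, then shut down gracefully
energy_wh = load_watts * hold_seconds / 3600
print(f"{energy_wh:.1f} Wh needed")   # ~8.3 Wh, i.e., roughly one laptop battery cell's worth
```

Roughly 8 Wh is a single laptop cell, which supports the suspicion that the obstacle is consumer demand rather than physics.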

Here’s my proposed build, with no case:

  • CPU: AMD Ryzen 9 9950X3D 4.3 GHz 16-Core Processor ($671.99 @ Amazon)
  • CPU Cooler: Cooler Master Hyper 212 Black Edition 42 CFM CPU Cooler ($29.99 @ Amazon)
  • Motherboard: ASRock X870 LiveMixer WiFi ATX AM5 Motherboard ($229.99 @ Amazon)
  • Memory: Corsair Vengeance 128 GB (2 x 64 GB) DDR5-6400 CL42 Memory ($359.99 @ Amazon)
  • Storage: Samsung 9100 PRO 8 TB M.2-2280 PCIe 5.0 x4 NVMe Solid State Drive
  • Storage: Seagate BarraCuda 24 TB 3.5″ 7200 RPM Internal Hard Drive ($249.99 @ Newegg)
  • Storage: Seagate BarraCuda 24 TB 3.5″ 7200 RPM Internal Hard Drive ($249.99 @ Newegg)
  • Video Card: Asus PRIME GeForce RTX 5080 16 GB Video Card ($999.99 @ Amazon)
  • Power Supply: Corsair HX1000i (2023) 1000 W 80+ Platinum Certified Fully Modular ATX Power Supply ($239.99 @ Newegg)
  • Monitor: Samsung Odyssey Neo G95NC 57.0″ 7680 x 2160 240 Hz Curved Monitor ($1499.99 @ Abt)
  • Total: $4531.91
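
A quick sanity check on the arithmetic, using the prices copied from the list above: they sum to exactly $4531.91, which means the total does not include the unpriced 8 TB Samsung SSD.

```python
# Sum of the prices listed above (the Samsung 9100 PRO 8 TB shows no price).
prices = {
    "CPU (Ryzen 9 9950X3D)":         671.99,
    "CPU cooler (Hyper 212)":         29.99,
    "Motherboard (X870 LiveMixer)":  229.99,
    "Memory (128 GB DDR5-6400)":     359.99,
    "HDD #1 (24 TB BarraCuda)":      249.99,
    "HDD #2 (24 TB BarraCuda)":      249.99,
    "Video card (RTX 5080)":         999.99,
    "Power supply (HX1000i)":        239.99,
    "Monitor (Odyssey Neo G95NC)":  1499.99,
}
print(f"${sum(prices.values()):.2f}")   # -> $4531.91, i.e., the total excludes the SSD
```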

Questions:

  • what is the best case? It would be nice if it can hold one or two additional 3.5″ drives (maybe just move a couple from my old PC), but this isn’t essential
  • do I want the heat sink on the Samsung 8 TB M.2 SSD? It’s almost free and yet they sell the device with and without the heat sink (for mechanical fit?)
  • what is the right video card to get? I think RTX 5080 is what I want and I think that it will drive the crazy huge double-4K monitor, but I have no idea which brand video card makes sense (the ASUS was picked due to being reasonably cheap and available)
  • is the motherboard pick the right one? I might want to add a second M.2 drive some day. I can live with a max of 256 GB of RAM, I think
  • any other improvements?
Full post, including comments

20th anniversary of my bad idea for using a mobile phone as a home computer

Twenty years ago this month (i.e., two years before the iPhone), I wrote “Mobile Phone As Home Computer”:

What would you call a device that has a screen, a keyboard, storage for personal information such as contacts, email, documents, the ability to play audio and video files, some games, a spreadsheet program, and a communications capability? Sound like a personal computer? How about “mobile phone”?

A mobile phone has substantially all of the computing capabilities desired by a large fraction of the public. Why then would someone want to go to the trouble of installing and maintaining a personal computer (PC)? The PC has a larger keyboard and screen, a larger storage capacity, can play more sophisticated games, and has a faster communications capability.

If you are an architect and want to run a computer-aided design program, the PC is great. If you are an electrical engineer and want to design circuits, a PC is great. If you are a filmmaker and want to edit video, a PC is great. For all of these customers it would be difficult indeed to supplant the PC. For a large segment of the market, however, the PC represents confusion, misery, and wasted hours.

The PC is a scaled-down circa 1965 mainframe. The hardware engineers have done a brilliant job in changing the way that the circuits are constructed. The software engineers, unfortunately, have presented today’s consumer with much of the same complexity that professional programmers faced in 1965.

Why was it a bad idea? It’s been 20 years and nobody has been successful in the marketplace with anything like this. At least one prediction, though, has come true:

The PC industry, however, is seemingly unable to change. Nothing has been done to address the havoc wreaked on users except to build better desktop search tools for finding those lost files more quickly. You would think that the success of programs such as iTunes, MusicMatch, and Windows Media Player, which present a multi-categorized view of files in the underlying hierarchical file system, would inspire the authors of other PC programs but this seems not to have been the case.

The latest versions of MacOS and Windows 11 are more or less the same as they were 20 years ago, thus leading my unbiased team of fact-checkers (me and Mindy the Crippler, our golden retriever) to rate the prediction #MostlyTrue.

It would be substantially easier to implement my 2005 idea today because processing capability in the “dock” (a standard PC in disguise) would no longer be necessary for most users. And there are, in fact, a few unsuccessful products that have been built. From 2021, “5 laptop docks that let you use a smartphone like a notebook”:

Full post, including comments

Advice to a young unemployed computer science graduate: use freelance work to build a portfolio

I’ve recently spoken to a few young CS graduates who can’t find jobs. These are folks who got their bachelor’s degrees at schools one tier down from the super elite, e.g., University of Michigan rather than Stanford. I.e., a similar situation to what was recently covered in a New York Times article about a Purdue CS graduate who couldn’t find a job better than Chipotle and an Oregon State University graduate who applied to more than 5,700 positions.

Hiring a fresh CS graduate is risky for an employer because universities teach students how to work for an engineer, not how to be an engineer. Assignments in CS undergrad come in neat packages in which everything is doable within the allotted time. Engineering starts with talking to a customer to find out what is wanted/needed and then figuring out what is doable and which capabilities should be scheduled into which release of a program (the highest value and easiest-to-build capabilities go in v1.0). An employer thus has no idea whether a fresh CS graduate has or will ever develop any of the skills required to be useful. (I remember helping one very capable MIT graduate whose customer was unhappy with him. He’d spent all of his time on a paid project refactoring and reengineering code such that it was, in his view, more maintainable. He had let all of the customer’s requested features slip. The work that he’d done on the internals was invisible and undetectable to any user, admin or otherwise. He didn’t understand why he wasn’t a hero.)

What advice did I have for young people stuck in this situation? I advised against trying to cram for the puzzle tests that the most sought-after employers use as screeners. I advised signing up for freelance projects on Upwork or similar, charging nominal amounts if necessary to win clients, and then using the freelance projects to put together a portfolio. “If you were going to build a house would you hire an architect without looking at a portfolio of previous houses that the architect had designed?”

I’m not sure that I’ve found an example on the Web of what would be persuasive. https://benscott.dev/ is great from a visual/design point of view, but it doesn’t show the client’s perspective. I would prefer to see a portfolio that includes a photo of the client, the essence of the original request, and some screen shots showing that the client-requested features actually were developed. Finally, the project blurb should contain something about which tools were used, e.g., MySQL/Node.js or SQL Server/Microsoft .NET/C#.
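
To make that concrete, here is the kind of per-project record I have in mind, written out as a data structure; the client, features, and numbers are entirely invented for illustration:

```python
# Hypothetical portfolio entry -- every value here is made up for illustration.
portfolio_entry = {
    "client": "Owner of a three-location dog-grooming business",   # plus a photo of the client
    "original_request": "Let customers book and reschedule appointments online "
                        "instead of calling the front desk",
    "delivered_features": [
        "calendar-based booking with SMS reminders",   # screenshot 1
        "per-location staff schedules",                # screenshot 2
        "owner dashboard of no-show rates",            # screenshot 3
    ],
    "screenshots": ["booking.png", "schedule.png", "dashboard.png"],
    "tools": "MySQL / Node.js",
    "timeline": "6 weeks, fixed price",
}
```

Rendered as a Web page, each entry would pair that record with the client photo and the screenshots so that a prospective employer sees the work from the customer’s point of view.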

Readers: What else would you say to a recent BSCS grad who is applying everywhere and getting interviewed almost nowhere?

Separately, if all else fails I think there are plenty of jobs selling marijuana in New York City, with at least 15 shops within 3 blocks of my Lower East Side hotel.

Full post, including comments

Memory prices flat for 18 months

We supposedly live in a world where electronics get cheaper even if everything else is subject to rampant inflation. Here’s an example from PC Part Picker’s memory price trend for 18 months:

Most of the other graphs are flat as well. If you adjusted these for official CPI there would perhaps be a slight downward trend in real dollars.
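
To make the real-dollar adjustment concrete, here is the calculation with illustrative numbers (the index values are assumptions, not official CPI figures):

```python
# Deflating a flat nominal price by CPI (index levels here are illustrative, not official).
nominal_2023 = 360.00                 # e.g., a 128 GB DDR5 kit in 2023
nominal_2025 = 360.00                 # same sticker price 18+ months later
cpi_2023, cpi_2025 = 304.0, 322.0     # assumed index levels, ~6% cumulative inflation

real_2025 = nominal_2025 * (cpi_2023 / cpi_2025)
print(f"${real_2025:.2f} in 2023 dollars")   # ~$339.88, i.e., a slight real decline
```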

Why are memory prices more or less stuck at 2023 levels? Is it that fewer companies are making RAM? That the AI Boom (TM) has increased demand? (economics proves that immigrants don’t drive up prices for housing, but Econ 101 says that demand for memory drives up prices for memory)


Full post, including comments

$2449 of e-waste thanks to Microsoft (and best way for kids to organize and sort photos?)

Here’s my January 11, 2017 order for a Dell laptop computer with OLED screen:

The machine supports Trusted Platform Module 2.0, but the CPU per se isn’t supported by Microsoft for Windows 11. I had hoped to repurpose this machine as a digital photo organizer for the kids, but Windows 10 security updates will end later this year so that’s infeasible.

Is the de-supporting of Windows 10 going to be the largest e-waste event in the history of humanity? What’s Greta Thunberg going to say about this? (Maybe after shouting “Free free Palestine” she would say “Install Linux”?)

This raises a question… what is a good system for kids to use to organize photos taken with a modern camera? My preference is for the organizer to run locally with the photos stored on the laptop’s SSD with a cloud backup (maybe just Microsoft OneDrive if we stay with Windows) rather than sign them up for life to pay huge fees every month for cloud photo storage.

Could ACDSee be the modern answer to what we lost when Google discontinued (and failed to open source) Picasa? Or is the built-in Windows 11 Photos app sufficient? ChatGPT says that MacOS has a better photos app:

Maybe the kids are young enough to master ChromeOS (for skool), MacOS, and eventually Windows? I don’t love the idea of having to learn enough about MacOS to support their efforts, but it does seem that Apple is more serious about this challenge. Windows 11 runs like a pig on my three-year-old laptop, which cost $1700 and has 16 GB of RAM. I can’t figure out if it is Dropbox, Box, OneDrive, or the Microsoft Photos app that is causing the problem (if I do an “End Task” on Photos the machine seems to come out of its sluggish state).
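
If anyone wants to script that workaround rather than hunt through Task Manager, something like the following should work; it assumes the legacy process name Microsoft.Photos.exe, which may differ on newer builds of the Photos app:

```python
# Script the "End Task" workaround for a runaway Photos app (Windows).
# Assumes the process is named Microsoft.Photos.exe; newer Photos builds may use a different name.
import subprocess

subprocess.run(["taskkill", "/IM", "Microsoft.Photos.exe", "/F"], check=False)
```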

Speaking of avoiding e-waste, Boise, Idaho offers an awesome model for other cities: Reuseum. In addition to classes for kids, they offer refurbished Windows 11 machines at low prices, e.g., these machines that could use more RAM for $80:

Plus if you want to make a sculpture out of old PCs and telephones you can buy them by the pound:

Full post, including comments

Apple in China, the rise of iPod

Second post regarding Apple in China: The Capture of the World’s Greatest Company by Patrick McGee. This one is about Apple’s shift from making computers to making handheld devices. (See Apple in China book, Intro if you missed Post #1 about this book.)

… just a month after the launch of iTunes [January 2001], hardware chief Jon Rubinstein—aka Ruby—and procurement head Jeff Williams were in Japan and stopped by Toshiba. The Japanese supplier showed them a new hard drive, just 1.8 inches in diameter, with a massive 5 gigabytes of capacity. Toshiba didn’t really know what to do with it, but to Ruby, the implications were “obvious” immediately: this thing could hold a thousand MP3s! It was the enabling technology they needed. “Jeff,” Ruby quietly said, “we need to get all of these.” Williams negotiated an exclusive supply agreement as Ruby made sure the $10 million check they drew up wouldn’t bounce.
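
The "thousand MP3s" claim holds up to simple arithmetic, assuming the typical encoding of the era (roughly 4-minute songs at 128 kbps):

```python
# Sanity check on "5 GB could hold a thousand MP3s" (assumed 128 kbps, ~4-minute songs).
bitrate_bps = 128_000
song_seconds = 4 * 60
bytes_per_song = bitrate_bps * song_seconds / 8   # ~3.84 MB per song
print(int(5e9 / bytes_per_song))                  # ~1,300 songs
```

Call it 1,300 songs at 128 kbps, or about a thousand at a slightly higher bitrate.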

Rubinstein and Fadell would later dispute who the key figure was behind the hit MP3 player, but the truth is that its brilliance had multiple authors, reflecting how each domain in the pyramid structure (ID, PD, MD, and Ops) worked on their specialty simultaneously. Ruby had found Toshiba’s disk drive and realized its potential. Phil Schiller, of marketing, introduced the idea of the scroll wheel—probably the feature most loved by consumers, as it reacted to the velocity of each turn and enabled them to race through hundreds of songs in a matter of seconds. Fadell was the overall architect. He presented to Jobs a prototype made from foam core and stuffed with old fishing weights to give it some heft. Jony Ive’s team made it unapologetically white, with a polished, chrome-like stainless steel back, a remarkably sharp turn from the childlike colors of the iMac. It was an unusually high-end material for a mass-market product, giving it a feel unlike any other handheld device. It was also durable and could dissipate heat more effectively than plastic.

The MP3 player would remain nameless for months, until four people in branding tossed ideas back and forth with Jobs. Vinnie Chieco, a creative director, recalls how the team would write down every permutation and then sort them into three piles: the worst, the ones that suck, and the not horrible. He’d come up with one: Troubadour, named after French poets who went from town to town playing music. This thing, too, was mobile, could travel and play music. The metaphor worked. The name didn’t. Jobs had his own preferred moniker, which Chieco remembers but won’t share. Like MacMan—what Steve wanted to call the iMac—his idea wasn’t very good, and Chieco is hesitant to share something now that Jobs can’t defend. The other three people in the room told Jobs they loved his name for the device, perhaps trying to avoid his infamous wrath. But when Jobs asked Chieco for his opinion, the creative director said, “Well, I understand your name is novel, but…” Feeling as if he were putting his head in a guillotine, Chieco told Jobs the reasons he didn’t like it. Meanwhile, he kept thinking in metaphors. He was struck by the all-white design, which looked space-like. Riffing on Jobs’s idea that a Mac computer was the “hub for your digital life,” he considered how in the future, the ultimate hub would be the mother ship. The only way to escape would be in a pod that flies away for temporary adventures, returning to replenish and recharge. He got the idea from 2001: A Space Odyssey, and hey—now it was 2001! It felt serendipitous, like when the Macintosh emerged in the Orwellian year, 1984. He proposed Pod. Jobs didn’t hate it, and over a few meetings it grew on him until it became the obvious name. It just needed one tweak, one letter, and then it was perfect: iPod.

Why did Apple make a phone? It was obvious to everyone that consumers wouldn’t want an iPod once reasonably capable smartphones were ubiquitous. Profits from Apple computers were insignificant compared to profits from the mass market iPod.

Around mid-2005, another project began to gain traction internally. The interfaces team had been toying with multi-touch technology for roughly two years, aided by a start-up Apple had purchased called FingerWorks. Senior engineers from Project Purple knew about it, but the original concept was about rethinking the Mac’s interface. When Steve Jobs first showed Fadell the technology, asking if it might work for a phone, it was far from obvious that the enormous contraption Jobs pointed to was the future of something that would sit on your desk, let alone be shoved in your pocket. “It filled the room,” Fadell recalled. “There was a projector mounted on the ceiling, and it would project the Mac screen onto this surface that was maybe three or four feet square. Then you could touch the Mac screen and move things around and draw on it.”

Meanwhile, the fear that the iPod business would be cannibalized by the phone giants continued to fuel anxiety and innovation. “It was an existential crisis,” a senior engineer says. “[We were saying], ‘You realize what’s gonna happen here is this business we built on iPods is going to go away. We need to build a phone.’ ” Jobs eventually canceled the other phone ideas and declared multi-touch the future. He was adamant there’d be no keyboard, so the phone would be as full screen as possible. Apple’s engineers suddenly had to find suppliers that could build multi-touch displays at scale—something that didn’t exist at the time. There was no way Apple could send the specs to some factory and wait for the parts to be built; instead, it sent teams of engineers to Japan, Korea, Taiwan, and China to find hungry vendors it could work with to co-create the processes. “There were a few truly groundbreaking mass production processes we were involved with, where we really had to go around to find the best people in the entire world—the peak of what humans have developed for some of these technologies,” says a product manager. By early 2006, they had a full-screen prototype enclosed in brushed aluminum. Jobs and Ive “were exceedingly proud of it,” journalist Fred Vogelstein would later recount. “But because neither of them was an expert in the physics of radio waves, they didn’t realize they’d created a beautiful brick. Radio waves don’t travel through metal well.”

(I don’t understand how “cannibalized by the phone giants” made it through the purported editing process of this book. In business, cannibalized refers to a reduction of sales of Product A after the company that makes Product A introduces Product B. In the context of Apple, the iPhone might cannibalize sales from the iPod or a notebook-format Macintosh might cut into sales of desktop Macs rather than take sales away from IBM PCs.)

The iPhone required a lot of new manufacturing techniques, mostly developed by vendors in China and Taiwan, often with significant help from Apple engineers who’d fly over from California.

Another important supplier was TPK, which placed a special coating on the Corning glass, enabling the user’s fingers to transmit electrical signals. The Taiwanese start-up had been founded just a few years earlier by Michael Chiang, an entrepreneur who in the PC era had reportedly made $30 million sourcing monitors and then lost it all on one strategic mistake. In 1997 he began working with resistive touch panels used by point-of-sale registers. When Palm was shipping PDAs that worked with a stylus, Chiang worked on improving the technology to enable finger-based touchscreens, even showing the technology to Nokia. But nobody was interested until 2004, when a glass supplier introduced TPK to Apple. An iPhone engineer calls Chiang “a classic Taiwanese cowboy [who] committed to moving heaven and earth” by turning fields into factories that could build touchscreens. The factory was in Xiamen, a coastal city directly across from Taiwan. “The first iPhones 100 percent would not have shipped without that vendor,” this person says. He recalls Chiang responding to Apple by saying, “ ‘We can totally do that!’—even though [what we were asking was something] nobody in the world had ever done before.” Among the techniques Apple codeveloped with suppliers was a way to pattern, or etch, two sides of a piece of glass to do the touch sensor, at a time when film lithography processes were being done on only one side. Another pioneering technique is called rigid-to-rigid lamination, a process for bonding two materials using heat and pressure, which Apple applied to tape a stack of LCD displays to touch sensors and cover elements to create one material. The process was performed in a clean-room environment with custom robotics.

Instead of selecting components off the shelf, Apple was designing custom parts, crafting the manufacturing behind them, and orchestrating their assembly into enormously complex systems at such scale and flexibility that it could respond to fluctuating customer demand with precision. Just half a decade earlier, these sorts of feats were not possible in China. The main thing that had changed, remarkably, was Apple’s presence itself. So many of its engineers were going into the factories to train workers that the suppliers were developing new forms of practical know-how. “All the tech competence China has now is not the product of Chinese tech leadership drawing in Apple,” O’Marah says. “It’s the product of Apple going in there and building the tech competence.”

We might owe most of our current toys to Apple’s 2010 agreement with TSMC, motivated by a desire to reduce its dependence on Samsung:

In 2010, Apple operations chief Jeff Williams reached out to Morris Chang through his wife, Sophie Chang, a relative of Terry Gou. Dinner between them launched months of “intense” negotiations, according to Chang, as Williams pressed TSMC on prices and convinced the Taiwanese group to make a major investment. “The risk was very substantial,” Williams recalled at a gathering for TSMC’s thirtieth anniversary in 2017. “If we were to bet heavily on TSMC, there would be no backup plan. You cannot double-plan the kind of volumes that we do. We want leading-edge technology, but we want it at established technology… volumes.” Williams’s narrative leaves out some of the most interesting facts about the early partnership. One is that Chang wouldn’t commit to Apple’s demands. In a 2025 interview with the podcast Acquired, Chang said that TSMC would’ve had to raise substantial amounts of money, either by selling bonds or issuing more stock. Williams had another idea: “You can eliminate your dividend.” Morris balked at the aggressive suggestion. “If we do what Jeff Williams says, our stock is going to drop like hell,” he recounted. Chang agreed to take only half of Apple’s order. Even this partial commitment forced TSMC to borrow $7 billion, so it could invest $9 billion and devote 6,000 full-time employees working round the clock to bring up a new chips fab in eleven months, according to Williams. “In the end, the execution was flawless,” he said. The partial commitment forced Apple to toggle between Samsung and TSMC, which some in Cupertino saw as a plus—it meant that Apple wasn’t beholden to just one supplier for what serves as the brain within the iPhone. But Srouji’s team found it nightmarish to manage both suppliers. So Apple turned to TSMC on an exclusive basis, establishing over-the-top contract terms to protect itself. A person familiar with the contract characterized it as saying: “We need to make sure that you’re gonna go out of business—if you’re gonna put us at risk of going out of business.” It was a “mutually assured destruction” type of situation, this person says, because if TSMC didn’t perform in any given year, there’d be no iPhone. So the Apple decision was made: “We are going to put all of our eggs in one basket, and then we’re gonna guard the basket.” TSMC’s bet would prove critical for making it the world leader in semiconductor fabrication, with Apple as its

Full post, including comments

The long dark winter is finally over

February 2024, regarding a tragedy that began in 2023: Microsoft keyboards back from the dead.

After massive daily injections of healing Paxlovid, the Sculpt keyboard has risen! Amazon now stocks the Incase “Designed by Microsoft” keyboards.

Get yours before the 6,000 percent tariffs kick back in (the case is stamped “Made in China”, almost surely by the same factory that Microsoft used).

The new supplier’s site:

Full post, including comments

Learn to Code

I don’t know if Joe Biden is dead or alive right now, but I have fond memories of his 2019 career advice to coal miners:

Suppose that a high school student took Joe Biden’s advice in 2019 but skipped the coal mining phase. He/she/ze/they will graduate from college in 3 months with a CS degree. We randomly selected this person so he/she/ze/they will have median skills as an entry-level computer programmer (“coder”).

Let’s hear from an LLM expert to get some insight into what the demand for a median-skilled programmer is likely to be…

Full post, including comments

One year until many of the world’s PCs can be taken over by a 10-year-old?

As the righteous in Tim Walz’s Minnesota, California, and Maskachusetts observe Indigenous Peoples’ Day and deplore the time when a hated Jew inflicted a Nakba on the entirely peaceful Native Americans, let’s look at the 21st century way to conquer a big part of the world.

From Microsoft, back in June:

I think this is because my desktop computer was running a CPU/chipset/motherboard configuration from 2015 that lacks some modern security features, such as a Trusted Platform Module.

I wonder if a lot of people won’t upgrade despite Microsoft’s threats. After all, improvements in computer hardware have slowed to a crawl (see GPU performance improvements since 2015 (and why not just use motherboard graphics?), in which we learn that the 2015 CPU had reasonable performance by 2022 standards). With Microsoft not bothering to continue with security updates and nearly every PC connected to the Internet, will a smart 10-year-old be able to take over a substantial fraction of the world’s computers? Windows 10 is not an open-source computer program that folks other than Microsoft can debug and patch.

Circling back to Indigenous Peoples’ Day… remember that immigration wasn’t the best thing that ever happened to Native Americans, but Science proves that immigration will be the best thing that has ever happened to Native-Born Americans.

Full post, including comments