> TSMC's A14 is brand-new process technology that is based on the company's 2nd Generation GAAFET nanosheet transistors and new standard cell architecture to enable performance, power, and scaling advantages. TSMC expects its A14 to deliver a 10% to 15% performance improvement at the same power and complexity, a 25% to 30% lower power consumption at the same frequency as well as transistor count, and 20% - 23% higher transistor density (for mixed chip design and logic, respectively), compared to N2. Since A14 is an all-new node, it will require new IPs, optimizations, and EDA software than N2P (which leverages N2 IP) as well as A16, which is N2P with backside power delivery.
Sure, being an essential part of the global supply chain for tech is important. It's also important to show support for Taiwan, to dissuade Japan, South Korea, etc. from arming themselves with nukes. That could set off a chain reaction in which everyone nearby says "f*ck it, I guess if everyone else is doing it..." The veneer of the US security umbrella folds and everyone suddenly feels they need to build (or retain) the bomb to protect themselves (as Ukraine failed to do in giving up theirs). Now everyone needs MORE nukes because you have a LOT more targets.
NATO doesn't consider Ukraine as significant because they have vital tech they supply globally. Rather, NATO is concerned about an aggressive regional power that may have aims on more than just Ukraine.
If China succeeds in leapfrogging ASML with their particle-accelerator light source, it likely won't matter. They will have a home-engineered piece of the solution.
they won't. full stop. they've even admitted as much in industry. they'll be extremely happy if they're (only!) a few years behind tsmc
at this point, what will be a real earthshaker is if china manages to get past 7nm. smic has gotten a long way but even that company's 7nm process has serious limitations (much higher cost and worse yields)
and anyways, aside from a handful of use cases (ai being one tbh), 7nm chips are more than viable for any general purpose task. a "leapfrog" is quickly diminishing from a "need" to a "want", and the resulting governmental support is fading as well. of course, it'll still be a high priority for the chinese, but it's not like top of the list.
You are two years behind in your assessment, and technology moves pretty fast especially when you're trying to catch up.
You could buy phones with SMIC 7nm chips as early as mid-2024, which means yields were good enough around mid-2023.
This indicates they'd be on track to do 5nm this year, which is what the news articles indicate. The impressive part is that this is catching up with ASML+TSMC combined. There's no other company or government in the world that has achieved this vertical in the last few decades. China is willing to sink Manhattan project level resources into this, for good reason.
Instead of using vague abstractions like "pouring unlimited money" and "hiring the right people", I'd hope that predictions would be predicated more on the actual specifics of the progress being made, i.e what are specific engineering problems to be overcome and what is their progress in doing so. If it's not, it's really just astrology one is using.
The Chinese perspective itself certainly isn't anything close to these vague abstractions, outside of vague anti-western polemics or nationalist chest-beating. After all, by the same logic we're pouring a "Manhattan's worth of funding" into "AI"; that doesn't mean we're reaching AGI anytime soon!
Industrial espionage is a thing. China has so far managed to get every tech they ever wanted and I don't see why EUV could not be stolen. Everything, even very advanced technology, can be reverse engineered, especially if you already know conceptually what it does. Software and data (and blueprints) can be stolen as well. ASML and TSMC have a lot of security in place but at least on HN I would assume everybody knows that that does not guarantee perfection. If the knowledge is out there, it will spread.
Isn’t the point that other countries would have to think twice before letting China get their hands on Taiwan? The advantage to China would be immense if they secured Taiwan.
There is another alternative that is much more likely: China gets to or at near parity and then no longer needs to get their hands on Taiwan. At that point they could just as easily destroy it as that they would want to occupy it. And that is a lot simpler. As long as Taiwan has an edge they are safer than when they are a commodity.
Kinda like Hong Kong when it came to the finance industry. Agree mostly. Instead of destroy, I would say China wouldn’t need to/care to maintain their fine balance/special relationship with Taiwan anymore and would throw their weight around more.
You're assuming that China is afraid of "other countries", or that those countries can do anything to stop them. Truth is, China right now is challenging the US for global dominance; they are afraid of no one. They don't need anyone's permission to "get their hands on Taiwan".
No, I’m assuming that they’ll want to get their hands on Taiwan with as little mess as possible and keep the assets intact if possible. If their only choice was to start a global war, I don’t think they’d do it regardless of whether they would win.
Well, that assumption might be wrong as well... Yes, taking control of the TSMC fabs is the best possible outcome for China, but destroying them is also not so bad. It limits the supply of bleeding edge semiconductors to the Western world, giving Huawei and other Chinese companies more time to catch up.
WW3 would not be sanctions rivaling those Russia are under. WW3 would be zero trade, submarines sinking the shipping etc. What Russia is under is just sanctions, not real war measures.
This is true but it has been fascinating seeing them trying to catch up. Necessity being mother of invention, all of that.
It is still hard to get much of a clear picture on how well they are doing on this stuff. Chinese companies are saying they are near parity, the opposition says they are 15 years behind, the reality is probably somewhere in between.
While they have made a lot of quick progress, that is no guarantee for future gains.
My worry is that they're not bringing anything new to the table except "this is actually possible if you will it".
On the flipside Canon's Nanoimprint Lithography promises lower cost, even if their feature size is 15nm in practice. They also appear to be having competition now:
Recently Canon shipped the first commercial device and it appears that this path will see continued development. 15nm(14 advertised) is already good enough for, say, automotive applications.
MIC2025 pushed China to manufacture over 90% of all semiconductors domestically. They have replacements/clones of every jelly-bean part imaginable. China can keep manufacturing like nothing happened if/when Taiwan is erased from the map.
They have a cost advantage. Kinda fine for where I'm standing; if we want to have more investment, we must liberalize migration! If we don't liberalize migration, necessarily the capital-labor ratio will be more capital-scarce in other countries.
I don't think it's easy to migrate to Taiwan (it's very unlikely that it's easier than migrating here, most East Asian countries make that difficult) so that doesn't seem to actually be a prerequisite.
We already have a pretty serious unemployment problem among college graduates so something else seems to be going on (a problem with domestic universities?)
Unemployment is actually a symptom of inelasticity(?) of wages. Some people become unemployed instead of getting a minimum wage job. Until Americans can subsist on a 1k monthly salary, the cost advantage in Taiwan is real.
I blame the American corporate meme. American corporations are hideously slow, lumbering and quite honestly many are just "too big to fail" prop ups at this point. Long gone are actual qualified individuals running even semiconductor manufacturers and its just bean counters and country club nephews.
Pretty sure Intel failed because of the opposite: they were so ambitious in attempting to do both fab and design that they got outmaneuvered by more nimble companies like AMD and TSMC.
American corporations are what created "Silicon Valley" in the first place. America is not slow, and it's definitely not "too big to fail" as the current administration is trying to make it fail, but that is an aside.
I think America doesn't manufacture semiconductors because it is a very unclean process, full of nasty chemicals. It's expensive to make semiconductors and deal with the clean-up. There are fewer environmental restrictions and cheaper labor in other parts of the world.
There are a bunch of Superfund sites around Mountain View, CA that serve as a reminder about the US Semiconductor industry - Fairchild Semiconductor, Intel, National Semiconductor, Monolithic Memories, and Raytheon to name a few.
Nobody in the U.S. really wants that in their back yard. Of course we've seen the same kind of thing from fracking, and everything else that rightly should be regulated or banned.
What happens now with a defunded and purposefully dysfunctional EPA is anyone's guess. Maybe manufacturers will exploit the political climate to further destroy the environment to make a few more million or billion dollars.
> I think America doesn't manufacture semiconductors because it is a very unclean process, full of nasty chemicals. It's expensive to make semiconductors and deal with the clean-up. There are less environmental restrictions and cheaper labor in other parts of the world.
> Nobody in the U.S. really wants that in their back yard.
Disagree. I worked in Intel's flagship semiconductor r&d division in Oregon in the 2010s.
Everybody wanted Intel in their backyard. It was a huge source of high paying and stable jobs, both for engineers with PhDs, like me, and for thousands of technicians and support staff.
There were protected wetlands on Intel's campuses, and parks and fields around them. Certainly Intel wasn't perfect from an environmental point of view, but it was not a high source of pollution.
I've worked at/with many other semiconductor fabs around the US and around the world, and they're mostly similar in this regard. Far "cleaner" than factories in many other industries.
“Silicon Valley” is more than just some of the post-war defense research and high tech weapons contracts. It includes the financing/fundraising, the talent pool, nearby university research, and the advantages of “fail fast” startup culture (including California’s jurisprudence). It didn’t become recognizable in its current form until after NASA’s Moon Shot project threw tons of funding at research and manufacturing of computer miniaturization.
This. There are a few areas in the US that have a long history of being incubators of engineering firms beyond Silicon Valley, all because of WW2 which had tons of money being spent to produce certain types of equipment in each area.
And of course, this is before the mega defense contractors that exist now. The military absolutely fucking hates those megacorps and does still try and actively fund new small business entrants to military contracting. The problem is mega corps buy them up as the US is owned by them.
> "The popularization of the name is often credited to Don Hoefler, the first journalist to use the term in a news story.[1] His article 'Silicon Valley U.S.A.' was published on January 11, 1971."
"Silicon Valley" describes the period between the late 1960s and mid/late 1990s (and still to this day to some extent). It has nothing to do with what went on there around World War II. Yes, semiconductor corporations created "Silicon Valley".
Before that time it may have been a sort of "Vacuum Tube Valley", but that does not have the same ring to it. And around WW2 there was tech going on everywhere, not just around Mountain View.
I skimmed it pretty quickly, but it doesn't change the fact that nobody called it "Silicon Valley" until 1971. The article you sent me was about WW2, and military, so far as I could tell. Reading it wouldn't change anything about my statements.
It literally tracks the histories of the individuals who founded all the corporations people think of as belonging to "Silicon Valley". Things tend to exist for a while before they get a widely recognizable name, friend.
Silicon Valley as people think of it today (tech, talent, and capital) is generally considered to have come together in the early 1970s. That doesn't mean the ingredients weren't there before -- tech had been around for decades already, and the laws allowing free movement of talent go back to the late 1800s -- but by most accounts the early 1970s are when it all clicked. Note that there's no fact of the matter to be right or wrong about here, though.
If I had to pick a point in time as the beginning, I'd probably put it at the founding of either Kleiner Perkins or Intel (a couple years after or before the Silicon Valley moniker was coined, respectively). Before then funding mostly came from other companies. With Intel you have successful founders funding their own new company, and with Kleiner Perkins you have successful founders funding other founders. To me it isn't Silicon Valley until this dynamic emerges.
Thanks for letting me know you didn't read the article, and adding nothing to what is already said within it. If you go back and read it now, you may learn some things about how history differs from what is "generally considered" as you put it. Have fun!
I've seen Steve's talk. Like all historical accounts it's just a story. It pulls some details into the foreground and pushes the rest back. Other stories arrange the details differently, for example marking Silicon Valley's beginning quite a bit earlier, with the founding of HP, a decade before the Department of Defense existed. Steve's version isn't some transcendental truth, and people aren't wrong to disagree with it or with you.
Narrative and fact are two distinct aspects of history which work together. Portraying the heavily referenced and fact-laden linked article / talk as "just story" borders on dishonesty by intentionally ignoring the facts presented - the most interesting part. Who, What, When, Where, Why, and How.
> Steve's version isn't some transcendental truth
I don't see anywhere I make such a claim. "Silicon Valley" is a narrative. My point has been that the facts paint a deeper and more complex history than that narrative provides. Have a nice day!
TSMC's competitive advantage comes from Taiwan's unique willingness to look away from wanton dumping of used acid wash like it's the 80s in Silicon Valley? Or moderately more expensive labor on one of those highly automated factories with FOUPs zooming every which way?
Quite a bit of "Silicon Valley" was founded on outright theft from competitors. Now that the American industry is entrenched and "protect intellectual property" dominates over "improvement", falling behind other nations is inevitable.
One of the risks of any belief in American exceptionalism is that it hides the reality that there’s nothing special about America to have deserved its position in industry and commerce. There’s no special reason why it might not soon be someone else’s turn.
The military industrial complex, endowment funds from large colleges and academic research created Silicon Valley.
American corporations are fading into irrelevance through "financial management". Manufacturing powerhouses like Boeing and Intel are a shadow of their former selves and are really just coasting on inertia.
Pretty much everywhere you see "innovation", you will see government money. Look at the pharma industry. I doubt there's a drug out there that wasn't created by researchers using federal grants.
I'm often reminded of the story of Tetris. A handful of Soviets created the game. What was capitalism's contribution? Licensing agreements, sub-licensing agreements and so on. Put another way: building enclosures. That and rent-seeking is really all American corporations do anymore.
Software engineer salaries in the US are significantly higher than electrical and computer engineering salaries and have been for a while. Most of the bright and ambitious EE and CE people went to FAANG companies in the 2010s, and probably earlier too.
I had a great job in R&D at Intel, in a department full of PhDs, in the 2010s, then jumped to another semi company from 2015-20.
Just before the pandemic in 2020, I got a job at AWS as a software engineer. It wasn't the only reason, but it was clear that I could make a lot more money in software.
I quickly became disillusioned with almost everything about how large software companies work, and 5 years later I'm now back working as a data scientist for an advanced manufacturing company.
A big reason why TSMC is competitive on the global market is precisely that their wages are low. Granted, they don't have a finance industry or big tech to draw away talent.
This is effective feature size and has little to do with actual geometry. Transistor size has barely budged in the last 10–15 years. The limitation is electrical, and it's not clear where that limit is. The smallest gate was built with an AFM out of ~7 atoms; that's about 8 orders of magnitude smaller than a transistor, rn, and upwards of 9 smaller than a stdcell. There's a LOT of room; we just don't know a good path to get there.
"The smallest gate was built with an AFM out of ~7 atoms; that's about 8 orders of magnitude smaller than a transistor"
I was thrown off by your statement, so here are some numbers: a modern chip like Nvidia's GH100 manufactured at a 5 nm process is 80 billion gates in 814 mm². That means a gate is 100 nm wide which is the width of 500 silicon atoms. On a 2D area that's 250k atoms. I don't know the thickness but assuming it's also 500 atoms then a gate has a volume of 125 million atoms.
So I guess you get your "8 orders of magnitude" difference if you compare the three-dimensional volume (7 atoms vs 125 million). But on one dimension it's only 2 orders of magnitude (7 atoms vs 500). And the semiconductor industry measures progress on one dimension so to me the "2 orders of magnitude" seems the more relevant comparison to make.
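The back-of-the-envelope numbers above can be checked with a quick script, using the same rough assumptions as the comment (80 billion gates in 814 mm², ~0.2 nm per silicon atom):

```python
# Back-of-the-envelope check of the atom-count comparison above.
# Rough assumptions: GH100 ~80 billion gates in 814 mm^2,
# and ~0.2 nm per silicon atom (so 100 nm is about 500 atoms).
import math

gates = 80e9
die_area_nm2 = 814 * 1e12              # 814 mm^2 in nm^2 (1 mm^2 = 1e12 nm^2)
area_per_gate = die_area_nm2 / gates   # ~10,000 nm^2 per gate
width_nm = math.sqrt(area_per_gate)    # ~100 nm per side
atom_nm = 0.2

atoms_1d = width_nm / atom_nm          # ~500 atoms per side
atoms_3d = atoms_1d ** 3               # ~125 million atoms (assuming a cube)

# Compared against the ~7-atom AFM-built gate:
print(round(math.log10(atoms_1d / 7), 1))  # ~1.9 orders of magnitude in 1D
print(round(math.log10(atoms_3d / 7), 1))  # ~7.3 orders of magnitude in 3D
```

So both comparisons are right; they just measure different things (a linear dimension vs. a volume).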
You're missing the key point, which is that the size referenced as the semiconductor manufacturing node is no longer an accurate description of the true transistor size, it's more of a marketing term.
Even if it's possible to build transistors that are 1.4nm in size (or smaller), that is not what "1.4nm" means in the context of this announcement. I get that this can be confusing, it's just a case of smoke and mirrors because Moore's Law is already dead and semiconductor manufacturers don't want to spook investors. The performance gains are still real, but the reasons for getting them are no longer as simple as shrinking transistor size.
As for the true physical limits of transistor sizes, there are problems like quantum tunnelling that we aren't likely to overcome, so even if you can build a gate with 7 atoms, that doesn't mean it'll work effectively. Also note that "gate" does not necessarily mean "transistor".
> That's clearly false, have a quick look at this chart
Feels like it's not detailed enough to make an assessment.
For example, if die size is being increased to counter the lack of improvements from transistor shrinking, it may technically meet the criteria set out in Moore's Law, but not in the sense that most people use it as a yard stick for performance improvements.
> if die size is being increased to counter the lack of improvements from transistor shrinking
This definitely doesn't fully explain everything that is happening. Die sizes aren't changing that fast.
Even if it is a part, you're ignoring that it is still a difficult technical challenge. If it weren't, we'd see much larger dies. We have a lot of infrastructure around these chips that we're stacking together, and the biggest problem in supercomputing and data centers is I/O. People would love to have bigger chips. I mean, that's a far better solution than a dual-socket mobo.
> This definitely doesn't fully explain everything that is happening. Die sizes aren't changing that fast.
Let's look at a real world comparison.
Based on information I can find online...
* Apple M1 silicon die was 118.91mm2, used TSMC 5nm process node, and had 16 billion transistors.
* Apple M3 silicon die was 168mm2, used TSMC 3nm process node, and had 25 billion transistors.
If you compare these two, you can see that the increased die size did allow for most of the improvement in transistor count. Even if it's not a completely like-for-like comparison, and is not necessarily always as straightforward as this, it's obvious to me that transistor count on its own is a poor measure of semiconductor process improvements; a much better measure is transistor density (e.g. comparing how many transistors can fit into a die area of a fixed size, such as 100mm2).
I'm going to refer you to my other comment, because you did the same thing[0]
Second, let's actually check the math. Just divide the two numbers
- M1 135 million transistors per mm2
- M3 149 million transistors per mm2
So no, the increased die size did not allow for that. If we had the same ratio the M3 would have 22.7 billion transistors. (22.7-16)/(25-16) = 74.4%. I'll give you that this is "most" but 25% is not a rounding error. You gave yourself wiggle room, but that wiggle room is exactly the thing I was talking about and what you're arguing against.
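The division above, as a quick sketch (all figures taken from the parent comments):

```python
# Transistor density check for Apple M1 vs M3, using the numbers quoted upthread.
m1_density = 16e9 / 118.91    # ~134.6 million transistors / mm^2
m3_density = 25e9 / 168       # ~148.8 million transistors / mm^2

# If the M3 die had kept the M1's density, it would hold only:
m3_at_m1_density = m1_density * 168   # ~22.6 billion transistors

# Share of the M1 -> M3 transistor gain explained by the bigger die alone:
die_share = (m3_at_m1_density - 16e9) / (25e9 - 16e9)   # ~0.73

# So the process itself still contributed a real density gain:
print(round(m3_density / m1_density, 3))   # ~1.106, i.e. ~11% denser
```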
Don't just cherry pick things from what people said and stroke your ego. Sometimes other people aren't completely clueless.
They also demonstrated knowledge of the very thing you're trying to explain to them. So if you think they are being rude or hostile to you, be aware that, even if unintentionally, you insulted them first. To be honest, even I feel a little offended because I don't know how you read that comment and come away thinking they don't know that 1) the 'x'nm number is not representative, 2) gains are coming from things other than literal physical size, 3) quantum tunneling. They actively demonstrated knowledge of the first two and enough that I suspect they understand the third. I mean it is a complex phenomena that few people understand in depth, but it is also pretty well known and many understand at a very high level.
From a third party perspective: it comes off like you didn't read their comment. I think you're well intentioned, but stepped on a landmine.
Personally I wouldn't argue Moore's law is dead, but I do wonder about the cost per chip. We can still push smaller for a lot longer, but the costs are starting to creep up a bit.
I am not too worried though, because adjusted for inflation, we are still paying a lot less for this tech than we were for pre-2000s tech.
Economy of scale used to drop prices, complexity of manufacturing will slowly increase them again.
Similar concerns were expressed around the development of VLSI tooling in the '70s. We've been on this curve for a long time. I can speak with some experience that manufacturing of steel and wood products increases similarly in cost and complexity with scale. Beasts like these exist: https://en.wikipedia.org/wiki/Heavy_Press_Program
As long as the scale of production is increasing, additional investments will be warranted.
What mrb is pointing out is that OP is comparing two different units. The 7 atoms is a count of atoms in a 3D space, not a size dimension. Comparing a count of atoms with the physical size of a transistor is problematic.
So, it's more of an engineering problem than a physical one? I read somewhere a while ago about strange quantum effects activating at these scales too. What's the current state beyond 1.4 nm with our current knowledge?
Yes. But also what you read is correct too. There are quantum effects that need be accounted for at these levels (have been for quite some time). Both things can be true.
At the TSMC second-quarter earnings conference and conference call on Thursday, TSMC chairman C.C. Wei (魏哲家) said that after the completion of the company’s US$165 billion investment in the US, “about 30 percent of our 2-nanometer and more advanced capacity will be located in Arizona, creating an independent leading-edge semiconductor manufacturing cluster in the US.”
The Arizona investment includes six advanced wafer manufacturing fabs, two advanced packaging fabs and a major research and development center.
I mean, it could be - the highly filtered water could be re-filtered.
But unless it's cheaper to do so, or they're required by law to do so, they're just going to pump cleaner starting water out of the drinking supply and use that.
And good luck finding a city or state government that's not so desperate for big industry and tech jobs to arrive that they will hold their feet to the fire and demand they cut water use.
As a matter of TSMC and Taiwan government policy, they always build it first in Taiwan, run it for some years, and then build in the US. They keep Taiwan relevant and protected this way.
Yeah, who wouldn't invest locally first when there is an economic advantage to doing so? Their suppliers, talent base, and management are all there already.
TSMC announced new fabs in the US earlier this year. They need new fabs in Taiwan so nobody gets any ideas that TSMC could continue operations without a free Taiwan. Keeping Taiwan indispensable to the US is how they keep Chinese invasion plans in the planning stage.
Why would a free Taiwan be necessary? I don’t think the CCP would have any qualms about TSMC continuing operation. A Chinese company being the undisputed best at the most advanced industry in the world is a good thing for them. Assuming a bloodless takeover occurred, it would be business as usual.
The whole system that supports TSMC will break down in the event of a war.
You can see this with SMIC and their inability to get modern lithography systems from the only leading edge vendor ASML. Sure, you can create your own vendors to replace such companies, but they are unlikely to ever catch up to the leading edge or even be only a generation or two behind the leading edge despite massive investments.
With non-leading-edge equipment & processes you have to make compromises, like making much larger chips so you can get the same compute in a low power profile. This drives up the initial cost of every device you make, and you run into throughput issues like what Huawei has experienced, where they cannot produce enough chips to ship their flagship phones at a reasonable price and simultaneously keep them in stock.
Instead you get boutique products that sell out practically immediately because there were so few units that were able to be manufactured.
It seems very unlikely to me that between KMT loyalist troops and angry mobs that China would simply be allowed to take Taiwan without violence, and that nobody would decide to use TSMC as a hostage.
See the Swiss strategy, where every bridge and tunnel has its explosives pre-placed when it was built.
Fabs can and do recover from major contamination events. In 2021, Renesas suffered a fire that destroyed about 5% of their N3 building. It took them just under a month to resume production and just over 3 months to reach pre-incident production levels. Fab decontamination is a major task, but ultimately it's just a very, very thorough cleaning process.
Incidentally, they don't operate at BSL3 - that's a standard for biosafety that has more to do with protecting the outside world from the lab rather than vice-versa. Fabs operate in accordance with ISO 14644.
The implication I got from the GP comment is that the U.S. would be reluctant to have CCP manufacturing the processors due to the (proven) risk that they’ll modify and backdoor stuff.
If TSMC over-invests in US factories then they could be taken over under eminent domain if Taiwan were no longer independent. So they have to keep a large portion of manufacturing domestic to Taiwan for lessened geopolitical risk.
You underestimate how gobsmackingly dumb this administration is, IMO. They've cancelled extremely important, multi-year or -decade long clinical trials just for funsies.
Thanks to all the investment due to AI, we have been able to continue these improvements at the current rate. To put things into perspective, an Apple A18 Pro (currently 3nm) built at 1.4nm would use only ~50% of the energy at the same performance.
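That ~50% figure is roughly what you get by compounding TSMC's own per-node power claims; the ~27% per step below is a midpoint assumption (TSMC quotes 25-30% lower power at iso-performance for N2 vs. N3, and again for A14 vs. N2 per the article):

```python
# Compound the quoted per-node power reductions at the same performance.
# Assumption: ~27% lower power per full node step (midpoint of TSMC's 25-30%).
power = 1.0
for step_reduction in (0.27, 0.27):   # N3 -> N2, then N2 -> A14
    power *= (1 - step_reduction)

print(round(power, 2))   # ~0.53, i.e. roughly half the energy
```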
I am hoping we have more to squeeze out on an IPC or PPA (power, performance, area) metric. ARM seems to be in the lead in this area. Wondering if Apple will have something to show in the A19 / M5.
NAND and DRAM side is a bit more worrying though. Nothing in the pipeline suggests any dramatic changes.
Edit: Not sure why I am getting downvoted every time I say it is AI investment leading to improvements. I guess some on HN really hate AI.
What advantage will a 1.4nm chip have over a 4nm one? What new capabilities will this tech unlock on an edge device like my iPhone ?
Please don't mention lower power consumption.
Silicon is way outside my wheelhouse, so genuine question: why not mention power consumption? In the data center, is this not one of the most important metrics?
For instance, GK104 on 28nm was 3.5 billion transistors. AD104 today is 35 billion. Is Nvidia really paying 10x as much for an AD104 die as a GK104 die?
If your "cost per transistor" calculation includes amortization of the fixed costs of taping out a chip, over the expected production volume, then you can sometimes genuinely end up with newer process nodes being more expensive. Design for more advanced nodes keeps getting more expensive, and mask sets keep getting more expensive. Even more so if you're pricing out a mature process node compared to early in the production ramp up of a leading edge node.
There's significant demand for older process nodes and we constantly see new chips designed for older nodes, and those companies are usually saving money by doing so (it's rare for a new chip to require such high production volumes that it couldn't be made with the production capacity of leading-edge fabs).
Intel and AMD have both been selling for years chiplet-based processors that mix old and newer fab processes, using older cheaper nodes to make the parts of the processor that see little benefit from the latest and greatest nodes (eg. IO controllers) while using the newer nodes only for the performance-critical CPU cores. (Numerous small chiplets vs one large chip also helps with yields, but we don't see as many designs doing lots of chiplets on the same node.)
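The amortization argument above can be sketched with hypothetical numbers (the NRE and unit costs below are purely illustrative, not real foundry pricing):

```python
# Amortized cost per chip = fixed NRE (design + masks) / volume + recurring unit cost.
# Hypothetical figures: a leading-edge node has much higher NRE but a lower
# per-chip recurring cost; a mature node is the reverse.
def cost_per_chip(nre_usd, unit_cost_usd, volume):
    return nre_usd / volume + unit_cost_usd

def leading_edge(volume):
    return cost_per_chip(50e6, 20.0, volume)   # $50M NRE, $20/chip (hypothetical)

def mature_node(volume):
    return cost_per_chip(5e6, 30.0, volume)    # $5M NRE, $30/chip (hypothetical)

# At low volume the mature node wins; only at high volume does the leading
# edge pay off its NRE (crossover here: 45e6 / 10 = 4.5M units).
print(mature_node(100_000) < leading_edge(100_000))        # True
print(leading_edge(10_000_000) < mature_node(10_000_000))  # True
```

This is why a new chip has to expect multi-million unit volumes before the newest node is the cheaper choice per chip.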
What google turns up when I google this is this statement by google [1], which attributes the low point to 28nm (as of 2023)... and I tend to agree with the person you are responding to that that doesn't pass the sniff test...
My laptop definitely dies significantly faster when I'm making it work instead of just mindlessly scrolling on it... since the display is on in both cases, I don't see what it could be but chip power consumption making a significant difference.
My phone dies much faster when I am using it, but admittedly screen usage means I can't prove that's chip power consumption.
VR headsets get noticeably hot in use, and I'm all but certain that that is largely chip power usage.
Macbook airs are the same speed as macbook pros until they thermally throttle, because the chips use too much power.
I've not checked it, but AFAIK power consumption isn't really improved much, if at all, with die shrinks. The main benefits are entirely around transistor density increases, which allow for things like bigger caches.
It'll be beneficial to DRAM chips, allowing for higher density memory. And it'll be beneficial to GPGPUs, allowing for more GPU processors in a package.
> The main benefits are entirely around transistor density increases which allows for things like bigger caches
SRAM is probably the worst example, as it scales poorly with process shrinks. There are tricks still left in the bag to deal with this, like GAA, but caches and SRAM cells are not the headline here. It's power and general transistor density.
If the marketing naming is to be believed, in 1.4nm vs 4nm you'd be able to fit ~twice the transistors in your chip. That's twice the cores, twice the cache... That usually makes it faster.
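Worth noting: taking the node names literally gives a much bigger number than "twice". If "4nm" and "1.4nm" still described linear feature size (later comments in this thread note they don't, and that the names are essentially marketing), the implied area density gain would be the square of the ratio. A purely illustrative sketch, with the node names as the only input:

```python
# Illustrative only: what the node *names* would imply if they still
# described a linear feature size (they don't; see downthread).
old_nm, new_nm = 4.0, 1.4

# Area scales with the square of linear dimension, so density would too.
naive_density_gain = (old_nm / new_nm) ** 2

print(f"name-implied density gain: {naive_density_gain:.1f}x")  # ~8.2x
```

The gap between this ~8x and the ~2x that vendors' published density figures actually suggest is one way to see how far the names have drifted from geometry.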
A 1.4nm chip offers significant performance and capability improvements over a 4nm chip, primarily due to increased transistor density. This allows for more powerful and efficient on-device AI processing, enabling new features and capabilities on devices like an iPhone without relying on cloud-based services
I find it amusing how we’ve come from treating AI as a novelty to developing a sense of how it writes in the space of a few months. That parent comment doesn’t even have the famed em dashes, for instance. Still, we are able to recognize it as AI-generated just by looking at its syntax.
For me it is the lack of content, the blandness of the statement. You can tell it is just saying vague statements that could be true if you substituted 14nm and 8nm for 4nm and 1.4nm.
More information on the new node:
> TSMC's A14 is brand-new process technology that is based on the company's 2nd Generation GAAFET nanosheet transistors and new standard cell architecture to enable performance, power, and scaling advantages. TSMC expects its A14 to deliver a 10% to 15% performance improvement at the same power and complexity, a 25% to 30% lower power consumption at the same frequency as well as transistor count, and 20% - 23% higher transistor density (for mixed chip design and logic, respectively), compared to N2. Since A14 is an all-new node, it will require new IPs, optimizations, and EDA software than N2P (which leverages N2 IP) as well as A16, which is N2P with backside power delivery.
https://www.tomshardware.com/tech-industry/tsmc-unveils-1-4n...
What about SRAM?
At this point, the transistors will scale smaller and eventually the whole chip will just be SRAM by area.
We'll just have less SRAM per core. Maybe move to eDRAM for last-level caches where speed is not so important.
Kind of sad what's happened to US semiconductor manufacturing. Speaking from an American perspective, of course.
The US is trying to get fabrication out of Taiwan so that it doesn’t need to defend Taiwan from China.
If you were Taiwanese this would worry you?
It makes complete sense for Taiwan to invest in maintaining its “silicon shield” even as China tries to catch up with fabrication on the mainland.
Sure, being an essential part of the global supply chain for tech is important. It's also important to show support for Taiwan, to dissuade Japan, South Korea, etc. from arming with nukes. That could set off a chain reaction in which everyone who is close says "f*ck it, I guess if everyone else is doing it..." The veneer of the US security umbrella folds and everyone suddenly feels they need to build (or retain) the bomb to protect themselves (like Ukraine failed to do in giving up theirs). Now everyone needs MORE nukes because you have a LOT more targets.
NATO doesn't consider Ukraine as significant because they have vital tech they supply globally. Rather, NATO is concerned about an aggressive regional power that may have aims on more than just Ukraine.
For NATO maybe not, but for Europe and a fair chunk of the rest of the world Ukrainian grain exports are strategic.
If China succeeds in leapfrogging ASML with their particle accelerator light source, it likely won't matter. They will have a home-engineered piece of the solution.
> If china succeeds in leapfrogging ASML
they won't. full stop. they've even admitted as much in industry. they'll be extremely happy if they're (only!) a few years behind tsmc
at this point, what will be a real earthshaker is if china manages to get past 7nm. smic has gotten a long way but even that company's 7nm process has serious limitations (much higher cost and worse yields)
and anyways, aside from a handful of use cases (ai being one tbh), 7nm chips are more than viable for any general purpose task. a "leapfrog" is quickly diminishing from a "need" to a "want", and the resulting governmental support is fading as well. of course, it'll still be a high priority for the chinese, but it's not like top of the list.
You are two years behind in your assessment, and technology moves pretty fast especially when you're trying to catch up.
You could buy phones with SMIC 7nm chips as early as mid 2024, that means yields were good enough around mid 2023.
This indicates they'd be on track to do 5nm this year, which is what the news articles indicate. The impressive part is that this is catching up with ASML+TSMC combined. There's no other company or government in the world that has achieved this vertical in the last few decades. China is willing to sink Manhattan project level resources into this, for good reason.
Pretty sure they achieved that with DUV multipatterning, which isn't leapfrogging ASML at all.
If they keep pouring unlimited money at it and hiring the right people, I'd predict parity within two years.
Instead of using vague abstractions like "pouring unlimited money" and "hiring the right people", I'd hope that predictions would be predicated more on the actual specifics of the progress being made, i.e what are specific engineering problems to be overcome and what is their progress in doing so. If it's not, it's really just astrology one is using.
The Chinese perspective itself certainly isn't anything close to these vague abstractions, outside of vague anti-western polemics or nationalist chest-beating. After all, by the same logic we're pouring a "Manhattan's worth of funding" into "AI"; that doesn't mean we're going to be reaching Gen-AI anytime soon!
Industrial espionage is a thing. China has so far managed to get every tech they ever wanted and I don't see why EUV could not be stolen. Everything, even very advanced technology, can be reverse engineered, especially if you already know conceptually what it does. Software and data (and blueprints) can be stolen as well. ASML and TSMC have a lot of security in place but at least on HN I would assume everybody knows that that does not guarantee perfection. If the knowledge is out there, it will spread.
We’re assuming that the “silicon shield” is even a thing anymore.
China can comfortably make chips that might be the equivalent of 5 year old Taiwanese ones. Last time I checked, that’s extremely viable.
No military general ever is going to say, “we can’t invade, we’re half a decade behind!”
Isn’t the point that other countries would have to think twice before letting China get their hands on Taiwan? The advantage to China would be immense if they secured Taiwan.
There is another alternative that is much more likely: China gets to or at near parity and then no longer needs to get their hands on Taiwan. At that point they could just as easily destroy it as that they would want to occupy it. And that is a lot simpler. As long as Taiwan has an edge they are safer than when they are a commodity.
Kinda like Hong Kong when it came to the finance industry. Agree mostly. Instead of destroy, I would say China wouldn’t need to/care to maintain their fine balance/ special relationship with Taiwan anymore and would throw their weight around more.
You're assuming that China is afraid of "other countries", or that those countries can do anything to stop them. Truth is, China right now is challenging the US for global dominance, they are afraid of noone. They don't need anyone's permission to "get their hands on Taiwan".
No, I’m assuming that they’ll want to get their hands on Taiwan with as little mess as possible and keep the assets intact if possible. If their only choice was to start a global war, I don’t think they’ll do it regardless if they would win or not.
Well, that assumption might be wrong as well... Yes, taking control of the TSMC fabs is the best possible outcome for China, but destroying them is also not so bad. It limits the supply of bleeding edge semiconductors to the Western world, giving Huawei and other Chinese companies more time to catch up.
> giving Huawei and other Chinese companies more time to catch up.
Nothing screams “favorable market conditions” like WW3 and sanctions rivaling Russia.
WW3 would not be sanctions rivaling those Russia are under. WW3 would be zero trade, submarines sinking the shipping etc. What Russia is under is just sanctions, not real war measures.
What's the likelihood that any TSMC buildings are surviving though if there is an invasion
It seems to be a pretty well substantiated rumor that TSMC’s fabs are rigged to blow in case of attack.
I wouldn't say comfortably - they're brute forcing it by using UV sources suitable for much older nodes.
End result requires more energy, has lower yield and is overall more expensive.
For military purposes and whatnot that's enough, but they can't put this in consumer devices without subsidies.
This is true but it has been fascinating seeing them trying to catch up. Necessity being mother of invention, all of that.
It is still hard to get much of a clear picture on how well they are doing on this stuff. Chinese companies are saying they are near parity, the opposition says they are 15 years behind, the reality is probably somewhere in between.
While they have made a lot of quick progress, that is no guarantee for future gains.
My worry is that they're not bringing anything new to the table except "this is actually possible if you will it".
On the flipside Canon's Nanoimprint Lithography promises lower cost, even if their feature size is 15nm in practice. They also appear to be having competition now:
https://www.zyvexlabs.com/apm/atomically-precise-nano-imprin...
Recently Canon shipped the first commercial device and it appears that this path will see continued development. 15nm(14 advertised) is already good enough for, say, automotive applications.
MIC2025 made China manufacture over 90% of all semi domestically. They have replacements/clones of every jelly bean part imaginable. China can keep manufacturing like nothing happened if/when Taiwan is erased from the map.
They have a cost advantage. Kinda fine from where I'm standing; if we want to have more investment, we must liberalize migration! If we don't liberalize migration, the capital-labor ratio will necessarily be more capital-scarce in other countries.
I don't think it's easy to migrate to Taiwan (it's very unlikely that it's easier than migrating here, most East Asian countries make that difficult) so that doesn't seem to actually be a prerequisite.
We already have a pretty serious unemployment problem among college graduates so something else seems to be going on (a problem with domestic universities?)
Unemployment is actually a symptom of inelasticity(?) of wages. Some people become unemployed instead of getting a minimum wage job. Until Americans can subsist on a 1k monthly salary, the cost advantage in Taiwan is real.
"We already have a pretty serious unemployment problem among college graduates "
Among stem graduates?
I blame the American corporate meme. American corporations are hideously slow, lumbering, and quite honestly many are just "too big to fail" prop-ups at this point. Long gone are actual qualified individuals running even semiconductor manufacturers; it's just bean counters and country club nephews.
Intel was killed by greedy investors that wanted profits plowed into buybacks instead of investing in the next fabs.
Pretty sure Intel failed because of the opposite: they were too ambitious in attempting to do both fab and design, and they were outmaneuvered by more nimble companies like AMD and TSMC.
American corporations are what created "Silicon Valley" in the first place. America is not slow, and it's definitely not "too big to fail" as the current administration is trying to make it fail, but that is an aside.
I think America doesn't manufacture semiconductors because it is a very unclean process, full of nasty chemicals. It's expensive to make semiconductors and deal with the clean-up. There are less environmental restrictions and cheaper labor in other parts of the world.
There are a bunch of Superfund sites around Mountain View, CA that serve as a reminder about the US Semiconductor industry - Fairchild Semiconductor, Intel, National Semiconductor, Monolithic Memories, and Raytheon to name a few.
Nobody in the U.S. really wants that in their back yard. Of course we've seen the same kind of thing from fracking, and everything else that rightly should be regulated or banned.
What happens now with a defunded and purposefully dysfunctional EPA is anyone's guess. Maybe manufacturers will exploit the political climate to further destroy the environment to make a few more million or billion dollars.
> I think America doesn't manufacture semiconductors because it is a very unclean process, full of nasty chemicals. It's expensive to make semiconductors and deal with the clean-up. There are less environmental restrictions and cheaper labor in other parts of the world.
> Nobody in the U.S. really wants that in their back yard.
Disagree. I worked in Intel's flagship semiconductor r&d division in Oregon in the 2010s.
Everybody wanted Intel in their backyard. It was a huge source of high paying and stable jobs, both for engineers with PhDs, like me, and for thousands of technicians and support staff.
There were protected wetlands on Intel's campuses, and parks and fields around them. Certainly Intel wasn't perfect from an environmental point of view, but it was not a high source of pollution.
I've worked at/with many other semiconductor fabs around the US and around the world, and they're mostly similar in this regard. Far "cleaner" than factories in many other industries.
> American corporations are what created "Silicon Valley" in the first place.
According to https://steveblank.com/2009/04/27/the-secret-history-of-sili... and https://www.youtube.com/watch?v=ZTC_RxWN_xo the creation of Silicon Valley had more to do with academic expertise in radio research and Department of Defense funding circa World War II. Corporations were the "second wave".
“Silicon Valley” is more than just some of the post-war defense research and high tech weapons contracts. It includes the financing/fundraising, the talent pool, nearby university research, and the advantages of “fail fast” startup culture (including California’s jurisprudence). It didn’t become recognizable in its current form until after NASA’s Moon Shot project threw tons of funding at research and manufacturing of computer miniaturization.
This. There are a few areas in the US that have a long history of being incubators of engineering firms beyond Silicon Valley, all because of WW2 which had tons of money being spent to produce certain types of equipment in each area.
And of course, this is before the mega defense contractors that exist now. The military absolutely fucking hates those megacorps and does still try and actively fund new small business entrants to military contracting. The problem is mega corps buy them up as the US is owned by them.
>"The popularization of the name is often credited to Don Hoefler, the first journalist to use the term in a news story.[1] His article "Silicon Valley U.S.A." was published in the January 11, 1971"
https://en.wikipedia.org/wiki/Silicon_Valley
"Silicon Valley" describes the period between the late 1960s and mid/late 1990s (and still to this day to some extent). It has nothing to do with what went on there around World War II. Yes, semiconductor corporations created "Silicon Valley".
Before that time it may have been a sort of "Vacuum Tube Valley", but that does not have the same ring to it. And around WW2 there was tech going on everywhere, not just around Mountain View.
Tell me you didn't read or watch the linked references without saying so.
I skimmed it pretty quickly, but it doesn't change the fact that nobody called it "Silicon Valley" until 1971. The article you sent me was about WW2, and military, so far as I could tell. Reading it wouldn't change anything about my statements.
It literally tracks the histories of the individuals who founded all the corporations people think of as belonging to "Silicon Valley". Things tend to exist for a while before they get a widely recognizable name, friend.
Silicon Valley as people think of it today (tech, talent, and capital) is generally considered to have come together in the early 1970s. That doesn't mean the ingredients weren't there before -- tech had been around for decades already, and the laws allowing free movement of talent go back to the late 1800s -- but by most accounts the early 1970s are when it all clicked. Note that there's no fact of the matter to be right or wrong about here, though.
If I had to pick a point in time as the beginning, I'd probably put it at the founding of either Kleiner Perkins or Intel (a couple years after or before the Silicon Valley moniker was coined, respectively). Before then funding mostly came from other companies. With Intel you have successful founders funding their own new company, and with Kleiner Perkins you have successful founders funding other founders. To me it isn't Silicon Valley until this dynamic emerges.
Thanks for letting me know you didn't read the article, and adding nothing to what is already said within it. If you go back and read it now, you may learn some things about how history differs from what is "generally considered" as you put it. Have fun!
I've seen Steve's talk. Like all historical accounts it's just a story. It pulls some details into the foreground and pushes the rest back. Other stories arrange the details differently, for example marking Silicon Valley's beginning quite a bit earlier, with the founding of HP, a decade before the Department of Defense existed. Steve's version isn't some transcendental truth, and people aren't wrong to disagree with it or with you.
> Like all historical accounts it's just a story.
Narrative and fact are two distinct aspects of history which work together. Portraying the heavily referenced and fact-laden linked article / talk as "just story" borders on dishonesty by intentionally ignoring the facts presented - the most interesting part. Who, What, When, Where, Why, and How.
> Steve's version isn't some transcendental truth
I don't see anywhere I make such a claim. "Silicon Valley" is a narrative. My point has been that the facts paint a deeper and more complex history than that narrative provides. Have a nice day!
GlobalFoundries has 3, Micron has 2, Intel has 4, Texas Instruments has 7, TSMC has 3, and Samsung has 1 in the U.S., etc.
A simple Google search for 'how many semiconductor fabs are in the U.S.' shows there are 70 commercial fabs.
And they all have to deal with far more regulations than a fab in Taiwan, and thus cost far more.
TSMC's competitive advantage comes from Taiwan's unique willingness to look away from wanton dumping of used acid wash like it's the 80s in Silicon Valley? Or moderately more expensive labor on one of those highly automated factories with FOUPs zooming every which way?
Press (X) to doubt.
Quite a bit of "Silicon Valley" was founded on outright theft from competitors. Now that the American industry is entrenched and "protect intellectual property" dominates over "improvement", falling behind other nations is inevitable.
One of the risks of any belief in American exceptionalism is that it hides the reality that there’s nothing special about America to have deserved its position in industry and commerce. There’s no special reason why it might not soon be someone else’s turn.
I don't really follow how your statement relates to o11c's. The theft they are referring to is of other American companies - not other countries' IP.
The military industrial complex, endowment funds from large colleges and academic research created Silicon Valley.
American corporations are fading into irrelevance through "financial management". Manufacturing powerhouses like Boeing and Intel are a shadow of their former selves and are really just coasting on inertia.
Pretty much everywhere you see "innovation", you will see government money. Look at the pharma industry. I doubt there's a drug out there that wasn't created by researchers using federal grants.
I'm often reminded of the story of Tetris. A handful of Soviets created the game. What was capitalism's contribution? Licensing agreements, sub-licensing agreements and so on. Put another way: building enclosures. That and rent-seeking is really all American corporations do anymore.
anddd, just like in any other Western European country, Americans need to be paid (semi-) living wages
Are you arguing that engineers' inadequate salaries are to blame for Intel losing its edge?
Software engineer salaries in the US are significantly higher than electrical and computer engineering salaries and have been for a while. Most of the bright and ambitious EE and CE people went to FAANG companies in the 2010s, and probably earlier too.
Yes, this is true.
I had a great job in R&D at Intel, in a department full of PhDs, in the 2010s, then jumped to another semi company from 2015-20.
Just before the pandemic in 2020, I got a job at AWS as a software engineer. It wasn't the only reason, but it was clear that I could make a lot more money in software.
I quickly became disillusioned with almost everything about how large software companies work, and now, 5 years later, I'm back working as a data scientist for an advanced manufacturing company.
A big reason why TSMC is competitive on the global market is precisely that their wages are low. Granted, they don't have a finance industry or big tech to draw away talent.
How close are we to the limits here? What is the smallest technology we can get to before physics gets in the way?
This is effective feature size and has little to do with actual geometry. Transistor size has barely budged in the last 10–15 years. The limitation is electrical, and it's not clear where that limit is. The smallest gate was built with an AFM out of ~7 atoms; that's about 8 orders of magnitude smaller than a transistor, rn, and upwards of 9 than a stdcell. There's a LOT of room; we just don't know a good path to get there.
"The smallest gate was built with an AFM out ~7 atoms; that's about 8 orders of magnitude smaller than a transistor"
I was thrown off by your statement, so here are some numbers: a modern chip like Nvidia's GH100 manufactured at a 5 nm process is 80 billion gates in 814 mm². That means a gate is 100 nm wide which is the width of 500 silicon atoms. On a 2D area that's 250k atoms. I don't know the thickness but assuming it's also 500 atoms then a gate has a volume of 125 million atoms.
So I guess you get your "8 orders of magnitude" difference if you compare the three-dimensional volume (7 atoms vs 125 million). But on one dimension it's only 2 orders of magnitude (7 atoms vs 500). And the semiconductor industry measures progress on one dimension so to me the "2 orders of magnitude" seems the more relevant comparison to make.
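Those numbers are easy to sanity-check. A quick Python sketch, using the GH100 figures quoted above and treating the silicon interatomic spacing as ~0.2 nm with a square transistor footprint (both rough assumptions):

```python
# Back-of-the-envelope check of the numbers above.
# Assumptions: GH100 public figures, ~0.2 nm Si atomic spacing,
# square transistor footprint.
transistors = 80e9        # GH100 transistor count
die_area_mm2 = 814        # GH100 die area

# 1 mm^2 = 1e12 nm^2
area_per_transistor_nm2 = die_area_mm2 * 1e12 / transistors
width_nm = area_per_transistor_nm2 ** 0.5   # side of a square footprint

atom_spacing_nm = 0.2
atoms_wide = width_nm / atom_spacing_nm

print(f"~{width_nm:.0f} nm per transistor, ~{atoms_wide:.0f} atoms wide")
```

This lands at roughly 100 nm on a side and ~500 atoms across, matching the one-dimensional "2 orders of magnitude" comparison above.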
You're missing the key point, which is that the size referenced as the semiconductor manufacturing node is no longer an accurate description of the true transistor size, it's more of a marketing term.
Even if it's possible to build transistors that are 1.4nm in size (or smaller), that is not what "1.4nm" means in the context of this announcement. I get that this can be confusing, it's just a case of smoke and mirrors because Moore's Law is already dead and semiconductor manufacturers don't want to spook investors. The performance gains are still real, but the reasons for getting them are no longer as simple as shrinking transistor size.
As for the true physical limits of transistor sizes, there are problems like quantum tunnelling that we aren't likely to overcome, so even if you can build a gate with 7 atoms, that doesn't mean it'll work effectively. Also note that "gate" does not necessarily mean "transistor".
I know and acknowledged it: GH100 gates are 100nm wide despite the "5nm" process. We all know about this discrepancy.
"Moore's Law is already dead"
That's clearly false, have a quick look at this chart: https://semiconductor.substack.com/p/the-relentless-pursuit-...
> That's clearly false, have a quick look at this chart
Feels like it's not detailed enough to make an assessment.
For example, if die size is being increased to counter the lack of improvements from transistor shrinking, it may technically meet the criteria set out in Moore's Law, but not in the sense that most people use it as a yard stick for performance improvements.
Even if it is a part, you're ignoring that it is still a difficult technical challenge. If it weren't, we'd see much larger dies. We have a lot of infrastructure around these chips that we're stacking together, and the biggest problem in supercomputing and data centers is I/O. People would love to have bigger chips. I mean, that's a far better solution than a dual-socket mobo.
> This definitely doesn't fully explain everything that is happening. Die sizes aren't changing that fast.
Let's look at a real world comparison.
Based on information I can find online...
* Apple M1 silicon die was 118.91mm2, used TSMC 5nm process node, and had 16 billion transistors.
* Apple M3 silicon die was 168mm2, used TSMC 3nm process node, and had 25 billion transistors.
If you compare these two, you can see that the increased die size did allow for most of the improvements in transistor count. Even if it's not a completely like-for-like comparison, and is not necessarily always as straightforward as this, it's obvious to me that transistor count on its own is a poor measure of semiconductor process improvements, and a much better measure is transistor density (e.g. comparing how many transistors can fit into a wafer of a fixed size, such as 100mm2).
I'm going to refer you to my other comment, because you did the same thing[0]
Second, let's actually check the math. Just divide the two numbers
So no, the increased die size did not allow for that. If we had kept the same density, the M3 would have about 22.6 billion transistors: (22.6-16)/(25-16) ≈ 73%. I'll give you that this is "most", but the remaining ~27% is not a rounding error. You gave yourself wiggle room, but that wiggle room is exactly the thing I was talking about and what you're arguing against. Don't just cherry-pick things from what people said and stroke your ego. Sometimes other people aren't completely clueless.
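For anyone following along, the arithmetic is easy to reproduce from the M1/M3 figures quoted earlier in the thread (it lands at roughly 73–74% depending on rounding):

```python
# Die sizes / transistor counts as quoted earlier in the thread;
# treat them as approximate.
m1_area, m1_transistors = 118.91, 16e9
m3_area, m3_transistors = 168.0, 25e9

# If the M3 had kept the M1's density, the bigger die alone would give:
scaled = m1_transistors / m1_area * m3_area
share_from_area = (scaled - m1_transistors) / (m3_transistors - m1_transistors)

print(f"M1 density: {m1_transistors / m1_area / 1e6:.0f} MTr/mm^2")
print(f"M3 density: {m3_transistors / m3_area / 1e6:.0f} MTr/mm^2")
print(f"share of transistor gain explained by die size: {share_from_area:.0%}")
```

So the die-size increase accounts for most, but not all, of the transistor gain; the rest is genuine density improvement.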
[0] https://news.ycombinator.com/item?id=44628144
They also demonstrated knowledge of the very thing you're trying to explain to them. So if you think they are being rude or hostile to you, be aware that, even if unintentionally, you insulted them first. To be honest, even I feel a little offended because I don't know how you read that comment and come away thinking they don't know that 1) the 'x'nm number is not representative, 2) gains are coming from things other than literal physical size, 3) quantum tunneling. They actively demonstrated knowledge of the first two and enough that I suspect they understand the third. I mean it is a complex phenomena that few people understand in depth, but it is also pretty well known and many understand at a very high level.
From a third party perspective: it comes off like you didn't read their comment. I think you're well intentioned, but stepped on a landmine.
> Moore's law is dead
People have said this for decades. Jim Keller believes otherwise and brought receipts: https://www.youtube.com/watch?v=oIG9ztQw2Gc
Personally I wouldn't argue Moore's law is dead, but I do wonder about the cost per chip. We can still push smaller for a lot longer, but the costs are starting to creep up a bit.
I am not too worried, though, because adjusted for inflation we are still paying a lot less for this tech than we were for pre-2000s tech.
Economy of scale used to drop prices, complexity of manufacturing will slowly increase them again.
Similar concerns were expressed around the development of VLSI tooling in the '70s. We've been on this curve for a long time. I can speak with some experience that manufacturing of steel and wood products increases similarly in cost and complexity with scale. Beasts like these exist: https://en.wikipedia.org/wiki/Heavy_Press_Program
As long as the scale of production is increasing, additional investments will be warranted.
> You're missing the key point
I think they understand the space well.
What mrb is pointing out is that OP is comparing two different units. The 7 atoms is a count of atoms in a 3D space, not a linear dimension. Comparing a count of atoms with the physical size of a transistor is problematic.
So, it's more of an engineering problem than a physical one? I read somewhere a while ago about strange quantum effects activating at these scales too. What's the current state beyond 1.4 nm with our current knowledge?
Yes. But what you read is correct too. There are quantum effects that need to be accounted for at these scales (and have been for quite some time). Both things can be true.
(in Taiwan)
> (in Taiwan)
But also:
At the TSMC second-quarter earnings conference and conference call on Thursday, TSMC chairman C.C. Wei (魏哲家) said that after the completion of the company’s US$165 billion investment in the US, “about 30 percent of our 2-nanometer and more advanced capacity will be located in Arizona, creating an independent leading-edge semiconductor manufacturing cluster in the US.”
The Arizona investment includes six advanced wafer manufacturing fabs, two advanced packaging fabs and a major research and development center.
Hey, how much water would that infrastructure need, possibly?
Isn't this water nearly 100% recyclable? It's not that it would get used up, like water used for watering of almond trees in California.
I mean, it could be - the highly filtered water could be re-filtered.
But unless it's cheaper to do so, or they're required by law to do so, they're just going to pump cleaner starting water out of the drinking supply and use that.
And good luck finding a city or state government that's not so desperate for big industry and tech jobs to arrive that they will hold their feet to the fire and demand they cut water use.
https://news.asu.edu/20240925-science-and-technology-why-chi...
why not do a simple google search?
There was a story about this a year ago: https://fortune.com/2024/04/08/tsmc-water-usage-phoenix-chip...
As TSMC and Taiwanese government policy, they always build first in Taiwan, run it for some years, and then build in the US. They keep Taiwan relevant and protected this way.
Geopolitics aside, is this not just good business sense given the accepted labor practices and talent pool in Taiwan vs. other countries?
Yeah, who wouldn't invest locally first when there is an economic advantage to doing so? Their suppliers, talent base, and management are all there already.
I don’t think you can separate geopolitics from business in this case.
The hint is in the company’s name. ;)
TSMC building outside of Taiwan is a big deal these days: https://en.wikipedia.org/wiki/TSMC#Arizona https://en.wikipedia.org/wiki/TSMC#Washington https://en.wikipedia.org/wiki/TSMC#Japan https://en.wikipedia.org/wiki/TSMC#Germany
From the article:
.. so it's interesting that they are moving forward with domestic 1.4nm given the geopolitical climate.

> The hint is in the company's name. ;)
They might build factories outside Taiwan you never know.
Of course. And were that the actual case, it would be worth having in the summary.
That the chips we need for the machines that will defend Taiwan are being built in Taiwan is just a ridiculous game of chicken to set up.
I wish they’d take the next step with the defense treaty and move even more capacity (esp. for the highest-grade stuff) stateside.
Most of the defense tech is not using bleeding edge N2-N7 nodes.
I wonder if they see reduced geopolitical risk or if they simply must continue to operate as if nothing is going to happen until something happens.
TSMC announced new fabs in the US earlier this year. They need new fabs in Taiwan so nobody gets any ideas that TSMC could continue operations without a free Taiwan. Keeping Taiwan indispensable to the US is how they keep Chinese invasion plans in the planning stage.
Why would a free Taiwan be necessary? I don’t think the CCP would have any qualms about TSMC continuing operation. A Chinese company being the undisputed best at the most advanced industry in the world is a good thing for them. Assuming a bloodless takeover occurred, it would be business as usual.
The whole system that supports TSMC will break down in the event of a war.
You can see this with SMIC and their inability to get modern lithography systems from the only leading edge vendor ASML. Sure, you can create your own vendors to replace such companies, but they are unlikely to ever catch up to the leading edge or even be only a generation or two behind the leading edge despite massive investments.
With non-leading edge equipment & processes you have to make compromises like making much larger chips so you can get the same compute in a low power profile. This drives up the initial cost of every device you make, and you run into throughput issues like what Huawei has experienced, where they cannot produce enough chips to ship their flagship phones at a reasonable price and simultaneously keep them in stock.
Instead you get boutique products that sell out practically immediately because there were so few units that were able to be manufactured.
"Bloodless takeover" is assuming a lot. Pro unification is a very fringe position: https://en.m.wikipedia.org/wiki/Chinese_Unification_Promotio...
It seems very unlikely to me that between KMT loyalist troops and angry mobs that China would simply be allowed to take Taiwan without violence, and that nobody would decide to use TSMC as a hostage.
See the Swiss strategy, where every bridge and tunnel has its explosives pre-placed when it was built.
I am not assuming a bloodless takeover; I am treating that as the best-case scenario, the one where the fabs are left intact.
All they need to do, is open a fire exit, and run a leafblower.
Fabs run at BSL3. Get that dirty, and you have a whole lot of expensive scrap metal.
Fabs can and do recover from major contamination events. In 2021, Renesas suffered a fire that destroyed about 5% of their N3 building. It took them just under a month to resume production and just over 3 months to reach pre-incident production levels. Fab decontamination is a major task, but ultimately it's just a very, very thorough cleaning process.
Incidentally, they don't operate at BSL3 - that's a standard for biosafety that has more to do with protecting the outside world from the lab rather than vice-versa. Fabs operate in accordance with ISO 14644.
Thanks for the extra info.
I used to work for a company that made steppers.
Pretty hairy stuff.
And you are correct. I have found “BSL3” conjures up the most appropriate images, though.
The implication I got from the GP comment is that the U.S. would be reluctant to have CCP manufacturing the processors due to the (proven) risk that they’ll modify and backdoor stuff.
If TSMC over-invests in US factories then those could be taken over under eminent domain if Taiwan were no longer independent. So they have to keep a large portion of manufacturing domestic to Taiwan for lessened geopolitical risk.
In that case almost any country would leave its borders wide open for refugee visas to get the semiconductor talent over. Even the US under Trump.
You underestimate how gobsmackingly dumb this administration is, IMO. They've cancelled extremely important, multi-year or -decade long clinical trials just for funsies.
Do you mean ‘bloodless’ like the way the CCP controls dissidents now?
There are rumors their fabs are rigged to self destruct rather than fall to china
https://www.theregister.com/AMP/2024/05/21/asml_kill_switch/
The best thing to do is become as valuable to the USA as possible
By the time the factories are completed, Trump will likely have changed his mind about the tariffs a dozen times. Just move along..
Thanks to all the investment due to AI, we have been able to continue these improvements at the current rate. To put things into perspective, an Apple A18 Pro (currently 3nm) fabbed on 1.4nm would only use ~50% of the energy at the same performance.
I am hoping we have more to squeeze out on an IPC or PPA (power, performance, area) metric. ARM seems to be in the lead in this area. Wondering if Apple will have something to show in A19 / M5.
NAND and DRAM side is a bit more worrying though. Nothing in the pipeline suggests any dramatic changes.
Edit: Not sure why I am getting downvoted every time I say it is AI investment leading to improvements. I guess some on HN really hate AI.
What advantage will a 1.4nm chip have over a 4nm one? What new capabilities will this tech unlock on an edge device like my iPhone ? Please don't mention lower power consumption.
> Please don't mention lower power consumption.
Silicon is way outside my wheelhouse, so genuine question: why not mention power consumption? In the data center, is this not one of the most important metrics?
It is even more important in portable battery powered devices.
> Please don't mention lower power consumption.
How about "longer battery life".
Also "lower cost".
Or sacrificing those on the altar of more compute running more complex things.
Cost per transistor stopped going down awhile ago
Can this be right?
For instance, GK104 on 28nm was 3.5 billion transistors. AD104 today is 35 billion. Is Nvidia really paying 10x as much for an AD104 die as a GK104 die?
If your "cost per transistor" calculation includes amortization of the fixed costs of taping out a chip, over the expected production volume, then you can sometimes genuinely end up with newer process nodes being more expensive. Design for more advanced nodes keeps getting more expensive, and mask sets keep getting more expensive. Even more so if you're pricing out a mature process node compared to early in the production ramp up of a leading edge node.
There's significant demand for older process nodes and we constantly see new chips designed for older nodes, and those companies are usually saving money by doing so (it's rare for a new chip to require such high production volumes that it couldn't be made with the production capacity of leading-edge fabs).
Intel and AMD have both been selling for years chiplet-based processors that mix old and newer fab processes, using older cheaper nodes to make the parts of the processor that see little benefit from the latest and greatest nodes (eg. IO controllers) while using the newer nodes only for the performance-critical CPU cores. (Numerous small chiplets vs one large chip also helps with yields, but we don't see as many designs doing lots of chiplets on the same node.)
28nm was over a decade ago. Cost scaling stopped around 2021
Do you have a citation for this?
What google turns up when I google this is this statement by google [1], which attributes the low point to 28nm (as of 2023)... and I tend to agree with the person you are responding to that that doesn't pass the sniff test...
[1] https://www.semiconductor-digest.com/moores-law-indeed-stopp...
Lower power consumption makes almost no difference at the consumer tier.
My laptop definitely dies significantly faster when I'm making it work instead of just mindlessly scrolling on it... since the display is on in both cases, I don't see what that could be but chip power consumption making a significant difference.
My phone dies much faster when I am using it, but admittedly screen usage means I can't prove that's chip power consumption.
VR headsets get noticeably hot in use, and I'm all but certain that that is largely chip power usage.
Macbook airs are the same speed as macbook pros until they thermally throttle, because the chips use too much power.
This claim just doesn't pass the smell test.
It might be niche, but I just got a new computer for this very reason.
Why wouldn’t you want lower power usage?
I've not checked it, but AFAIK power consumption isn't really improved much, if at all, with die shrinks. The main benefits are entirely around transistor density increases, which allows for things like bigger caches.
It'll be beneficial to DRAM chips, allowing for higher density memory. And it'll be beneficial to GPGPUs, allowing for more GPU processors in a package.
> The main benefits are entirely around transistor density increases which allows for things like bigger caches
SRAM is probably the worst example, as it scales poorly with process shrinks. There are tricks still left in the bag to deal with this, like GAA, but caches and SRAM cells are not the headline here. It's power and general transistor density.
Lower heat production.
Lower power consumption is always relevant for portable devices. 1.4nm will have many more transistors per mm^2 which should improve performance.
If the marketing naming is to be believed, in 1.4nm vs 4nm you'd be able to fit ~twice the transistors in your chip. That's twice the cores, twice the cache... That usually makes it faster.
If the marketing name is to believed... and we assume both dimensions scale the same... (4/1.4)^2 = 8.16x the transistors.
Facebook 2
More chips per wafer.
For iPhone, not much. It already has a ridiculously powerful CPU. SWEs can continue writing ridiculously inefficient code.
For data centers, it will help a lot. More compute for same power.
A 1.4nm chip offers significant performance and capability improvements over a 4nm chip, primarily due to increased transistor density. This allows for more powerful and efficient on-device AI processing, enabling new features and capabilities on devices like an iPhone without relying on cloud-based services
But at the same time, the cost of manufacturing may increase. But I have no data on this, it's just a guess.
It will increase, but amortization tends to make that fall off over time. Also the newer processes tend to result in smaller die sizes.
Production of anything on a new line is expensive, doesn't matter if it is chips or cheeze-its
But you also get more transistors per wafer
Depends on your yield, actually :( You get more transistors per square mm.
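Right — and yield is itself a function of die area, which is why a shrink helps twice. A common rule of thumb is the Poisson yield model, yield ≈ exp(−D·A) for defect density D and die area A. A rough sketch with hypothetical numbers (and ignoring edge losses and scribe lines):

```python
import math

def die_yield(defect_density_per_cm2, die_area_cm2):
    """Poisson yield model: fraction of dies with zero defects."""
    return math.exp(-defect_density_per_cm2 * die_area_cm2)

def good_dies_per_wafer(wafer_area_cm2, die_area_cm2, defect_density):
    # Crude: ignores edge losses and scribe lines for simplicity.
    candidates = wafer_area_cm2 // die_area_cm2
    return candidates * die_yield(defect_density, die_area_cm2)

wafer = math.pi * 15.0 ** 2  # 300 mm wafer, ~707 cm^2

# Same design: full-size die on the old node vs. a half-area shrink
# on the new node, at the same (made-up) defect density.
big = good_dies_per_wafer(wafer, die_area_cm2=6.0, defect_density=0.1)
small = good_dies_per_wafer(wafer, die_area_cm2=3.0, defect_density=0.1)

print(f"good big dies/wafer:   {big:.0f}")
print(f"good small dies/wafer: {small:.0f}")
```

Halving the die area here more than doubles the good dies per wafer, because the smaller die both fits more candidates and yields better.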
Is this chatGPT? Just want to check on a hunch.
I find it amusing how we’ve come from treating AI as a novelty to developing a sense of how it writes in the space of a few months. That parent comment doesn’t even have the famed em dashes, for instance. Still, we are able to recognize it as AI-generated just by looking at its syntax.
For me it is the lack of content, the blandness of the statement. You can tell it is just saying vague statements that could be true if you substituted 14nm and 8nm for 4nm and 1.4nm.
Looks like it's also his first post in a year. Feels fully AI-generated.