December 3rd, 2024

Intel announces Arc B-series "Battlemage" discrete graphics with Linux support

Intel announced its Arc B-Series "Battlemage" graphics cards, featuring the B580 and B570 models with improved performance. They are priced at $249 and $219 and ship next week and on January 16, respectively.

Curiosity · Skepticism · Disappointment

Intel has announced its next-generation Arc B-Series "Battlemage" discrete graphics cards, succeeding the two-year-old Alchemist series. The new Battlemage cards, which include the B580 and B570 models, feature significant improvements in performance and efficiency. The B580 is equipped with 20 Xe cores, a 2670MHz graphics clock, and 12GB of GDDR6 memory, while the B570 has 18 Xe cores, a 2500MHz clock, and 10GB of memory. Intel claims up to a 70% performance increase per Xe core and a 50% improvement in performance per watt compared to the previous generation. Both models support open-source graphics drivers on Linux, with the B580 priced at $249 and the B570 at $219, set to ship next week and on January 16, respectively. The cards are designed for mid-range gaming, particularly at 1440p resolution. They utilize a PCIe 4.0 x8 interface and require an 8-pin power connector. While initial benchmarks show the B580 outperforming the Arc A750 by 24% and competing favorably against NVIDIA's RTX 4060, detailed performance metrics and Linux support specifics will be available after the review embargo lifts.
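
A rough cross-check of how the per-core claim squares with the quoted A750 comparison (a back-of-the-envelope sketch; the A750's 28 Xe cores are a published Alchemist spec, not part of this announcement):

```python
# Back-of-the-envelope check: does "up to 70% faster per Xe core" roughly
# agree with the quoted 24% uplift over the Arc A750?
a750_xe_cores = 28       # Arc A750 (published Alchemist spec, not from this announcement)
b580_xe_cores = 20       # Arc B580, per the announcement
per_core_gain = 1.70     # Intel's "up to 70%" per-Xe-core claim

relative_perf = (b580_xe_cores * per_core_gain) / a750_xe_cores
print(f"B580 vs A750, core scaling only: {relative_perf:.2f}x")
# ~1.21x from core scaling alone, in the same ballpark as the claimed 24%
# average uplift; clock and memory differences account for the rest.
```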

- Intel's Arc B-Series "Battlemage" graphics cards have been announced, succeeding the Alchemist series.

- The B580 and B570 models feature significant performance and efficiency improvements.

- Both models support open-source graphics drivers on Linux.

- The B580 is priced at $249 and the B570 at $219, with shipping dates set for next week and January 16, respectively.

- Initial benchmarks indicate the B580 outperforms the Arc A750 and competes well against NVIDIA's RTX 4060.

AI: What people are saying
The announcement of Intel's Arc B-Series "Battlemage" graphics cards has generated a variety of reactions among commenters.
  • Many users express disappointment over the limited VRAM (12GB), suggesting it is insufficient for modern gaming and machine learning applications.
  • There is a strong interest in the cards' performance, particularly in relation to transcoding and Linux support, with some users eager to test their capabilities.
  • Concerns about Intel's driver support and overall reliability compared to Nvidia persist, with some users sharing past experiences with Intel's graphics cards.
  • Commenters are curious about the target audience for these GPUs, questioning whether they can compete effectively in the current market.
  • Several users highlight the aggressive pricing strategy of Intel, hoping it will lead to better options in the budget segment.
53 comments
By @Night_Thastus - 4 months
We'll have to wait for third-party benchmarks, but they seem decent so far. A 4060 equivalent for $200-$250 isn't bad at all. I'm curious if we'll get a B750 or B770 and how they'll perform.

At the very least, it's nice to have some decent BUDGET cards now. The ~$200 segment has been totally dead for years. I have a feeling Intel is losing a fair chunk of $ on each card though, just to enter the market.

By @declan_roberts - 4 months
I think a graphics card tailored for 2k gaming is actually great. 2k really is the goldilocks zone between 4k and 1080p graphics before you start creeping into diminishing returns.
By @rmm - 4 months
I put an A360 card into an old machine I turned into a Plex server. It turned it into a transcoding powerhouse. I can do multiple independent streams now without it skipping a beat. The price-performance ratio was off the chart.
By @jmward01 - 4 months
12GB max is a non-starter for ML work now. Why not come out with a reasonably priced 24gb card even if it isn't the fastest and target it at the ML dev world? Am I missing something here?
By @Implicated - 4 months
12GB memory

-.-

I feel like _anyone_ who can pump out GPUs with 24GB+ of memory that are usable for py-stuff would benefit greatly.

Even if it's not as performant as the NVIDIA options - just to be able to get the models to run, at whatever speed.

They would fly off the shelves.

By @Havoc - 4 months
Who is the target audience for this?

Well-informed gamers know Intel's discrete GPU is hanging by a thread, so they're not hopping on that bandwagon.

Too small for ML.

The only people really happy seem to be the ones buying it for transcoding and I can't imagine there is a huge market of people going "I need to go buy a card for AV1 encoding".

By @jmclnx - 4 months
>Battlemage is still treated to fully open-source graphics driver support on Linux.

I am hoping these are open in such a manner that they can be used in OpenBSD. Right now I avoid all hardware with an Nvidia GPU. That makes for somewhat slim pickings.

If the firmware is acceptable to the OpenBSD folks, then I will happily use these.

By @rbanffy - 4 months
For me, the most important feature is Linux support. Even if I'm not a gamer, I might want to use the GPU for compute and buggy proprietary drivers are much more than just an inconvenience.
By @Scene_Cast2 - 4 months
I wonder how many transistors it has and what the chip size is.

For power, it's 190 W compared to the 4060's 115 W.

EDIT: from [1]: B580 has 21.7 billion transistors at 406 mm² die area, compared to 4060's 18.9 billion and 146 mm². That's a big die.

[1] https://www.techpowerup.com/gpu-specs/arc-b580.c4244

By @confident_inept - 4 months
I'm really curious to see if these still rely heavily on Resizable BAR. Putting these in old computers without ReBAR support makes the Linux driver crash under literally any load, rendering the cards completely unusable.

It's a real shame; the single-slot A380 is a great price-to-performance card for light gaming and general use in small machines.

By @ChrisArchitect - 4 months
By @bjoli - 4 months
I love my A750. Works fantastic out of the box in Linux. Hardware encoding and decoding for every format I use. Flawless support for different screens.

I haven't regretted the purchase at all.

By @karmakaze - 4 months
I wanted to have alternative choices than Nvidia for high power GPUs. Then the more I thought about it, the more it made sense to rent cloud services for AI/ML workloads and lesser powered ones for gaming. The only use cases I could come up with for wanting high-end cards are 4k gaming (a luxury I can't justify for infrequent use) or for PC VR which may still be valid if/when a decent OLED (or mini-OLED) headset is available--the Sony PSVR2 with PC adapter is pretty close. The Bigscreen Beyond is also a milestone/benchmark.
By @CoastalCoder - 4 months
Given Intel's recent troubles, I'm trying to decide how risky it is to invest in their platform, especially discrete GPUs for Linux gaming.

Fortunately, having their Linux drivers be (mostly?) open source makes a purchase seem less risky.

By @ThatMedicIsASpy - 4 months
SR-IOV is supported on their iGPUs but is otherwise exclusive to their enterprise offerings. Give it to me on desktop and I'll buy.
By @bloodyplonker22 - 4 months
I wanted Intel to do well so I purchased an ARC card. The problem is not the hardware. For some games, it worked fine, but in others, it kept crashing left and right. After updates to drivers, crashing was reduced, but it still happened. Driver software is not easy to develop thoroughly. Even AMD had problems when compared to Nvidia when AMD really started to enter the GPU game after buying ATI. AMD has long since solved their driver woes, but years after ARC's launch, Intel still has not.
By @crowcroft - 4 months
Anyone using Intel graphics cards? Aside from specs, drivers and support can make or break the value prop of a graphics card. Would be curious what actually using these is like.
By @bigiain - 4 months
"Arc B"?

Presumably graphics cards optimised for hairdressers and telephone sanitisers?

By @greenavocado - 4 months
I'm not a gamer, and there is not enough memory in this thing for me to care about using it for AI applications, so that leaves just one thing I care about: hardware-accelerated video encoding and decoding. Let's see some performance metrics, both in speed and visual quality.
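
A minimal sketch of the kind of quick test this points at, assuming an ffmpeg build with VAAPI support and the av1_vaapi encoder; the device path and filenames are placeholders:

```python
# Rough timing of a hardware AV1 encode through VAAPI on Linux.
# Assumes ffmpeg was built with VAAPI and the av1_vaapi encoder;
# /dev/dri/renderD128 and the filenames are placeholders.
import subprocess
import time

cmd = [
    "ffmpeg", "-y",
    "-vaapi_device", "/dev/dri/renderD128",  # render node of the Arc card
    "-i", "input.mp4",
    "-vf", "format=nv12,hwupload",           # upload frames to the GPU
    "-c:v", "av1_vaapi",                     # hardware AV1 encode
    "output.mkv",
]

start = time.time()
subprocess.run(cmd, check=True)
print(f"Encode finished in {time.time() - start:.1f} s")
```

Speed is the easy half to measure; judging visual quality against a software reference encode (e.g. with VMAF) is the part that needs more care.
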
By @Venn1 - 4 months
I’ll pick up a B580 to see how it works with Jellyfin transcoding, OBS streaming using AV1, and, with some luck, Davinci Resolve. Maybe a little Blender?

Other exciting tests will include things like fan control, since that’s still an issue with Arc GPUs.

Should make for a fun blog post.

By @mtlmtlmtlmtl - 4 months
Bit disappointed there's no 16GB (or more) version. But absolutely thrilled the rumours of Intel discrete graphics' demise were wildly exaggerated (looking at you, Moore's Law Is Dead...).

Very happy with my A770. Godsend for people like me who want plenty of VRAM to play with neural nets but don't have the money for workstation GPUs or massively overpriced Nvidia flagships. Works painlessly with Linux, gaming performance is fine, and the price was the first time I haven't felt fleeced buying a GPU in many years. Not having CUDA does lead to some friction, but I think Nvidia's CUDA moat is a temporary situation.

Prolly sit this one out unless they release another SKU with 16GB or more RAM. But if Intel survives long enough to release Celestial, I'll happily buy one.

By @jvanderbot - 4 months
Seems to feature ray tracing (kind of obvious), but also upscaling.

My experience on WH40K DT has taught me that upscaling is absolutely vital for a reasonable experience on some games.

By @andrewstuart - 4 months
Intel can't compete head to head with Nvidia on performance.

But surely it's easy enough to compete on video ram - why not load their GPUs to the max with video ram?

And also video encoder cores: Intel has a great video encoder core, and these vary little from high-end to low-end GPUs, so they could make it a standout feature to have, for example, 8 video encoder cores instead of 2.

It's no wonder Nvidia is the king because AMD and Intel just don't seem willing to fight.

By @tommica - 4 months
Probably would jump to Intel once my 3060 gets too old
By @s17tnet - 4 months
What about sharing a GPU across multiple VMs? Hasn't Nvidia walled this feature behind unreasonably expensive products?
By @stracer - 4 months
Too late, and it has a bad rep. This effort from Intel to sell discrete GPUs is just inertia from old aspirations; it won't do much to save the company, as there is not much money in it. Most probably the whole Intel Arc effort will be mothballed, and probably many other projects will be too.
By @999900000999 - 4 months
I actually really like the Arc 770.

However, this is going to go on clearance within 6 months. Good for consumers, bad for Intel.

Also keep in mind for any ML task Nvidia has the best ecosystem around. AMD and Intel are both like 5 years behind to be charitable...

By @Archit3ch - 4 months
They say the best predictor for the future is the past.

How was driver support for their A-series?

By @gs17 - 4 months
> Intel with their Windows benchmarks are promoting the Arc B580 as being 24% faster than the Intel Arc A750

Not a huge fan of the numbering system they've used. B > A doesn't parse as easily as 5xxx > 4xxx to me.

By @zenethian - 4 months
These are pretty interesting, but I'm curious about the side-by-side screenshot with the slider: why does ray tracing need to be enabled to see the yellow stoplight? That seems like a weird oversight.
By @mushufasa - 4 months
what's the current status of using cuda on non-gpu chips?

IIRC that was one of the original goals of geohot's tinybox project, though I'm not sure exactly where that evolved

By @hx8 - 4 months
I like Intel's aggressive pricing against entry/mid level GPUs, which hopefully puts downward pressure on all GPUs. Overall, their biggest concern is software support. We've had reports of certain DX11/12 games failing to run properly on Proton, and the actual performance of the A series varied greatly between games even on Windows. I suspect we'll see the same issues when the B580 gets proper third party benchmarking.

Their dedication to Linux Support, combined with their good pricing makes this a potential buy for me in future versions. To be frank, I won't be replacing my 7900 XTX with this. Intel needs to provide more raw power in their cards and third parties need to improve their software support before this captures my business.

By @senectus1 - 4 months
Typical, they release this on the day I pick up a brand new A770 16gb to toy with LLM stuff.

ah well. pretty sure it'll do for my needs.

By @BadHumans - 4 months
I'm considering getting one to replace my 8 year old NVIDIA card but why are there 2 SKUs almost identical in price?
By @imbusy111 - 4 months
None of the store links work. Weird. Is this not supposed to be a public page yet?
By @brenainn - 4 months
From the gaming side of things, I'm disappointed that Intel and AMD are focusing on the midrange market going forwards. I'm on Linux with a 6900XT and wasn't going to upgrade until there's a compatible option with acceptable raytracing performance (and when HDR is finally sorted out). The 4090 and other high tier cards are absurdly expensive, would be good to have competition in that segment.
By @kookamamie - 4 months
Why, though? Intel's strategy seems puzzling, to say the least.
By @maxfurman - 4 months
How does this connect to Gelsinger's retirement, announced yesterday? The comments on that news were all doom and gloom, so I had expected more negative news today. Not a product launch. But I'm just some guy on HN, what do I know?
By @SeqDesign - 4 months
The new Intel Battlemage cards look sweet. If they can extend displays on Linux, then I'll definitely be buying one.
By @smcleod - 4 months
12GB of vRAM? What a wasted opportunity.
By @lhl - 4 months
Recently I did some testing of the IPEX-LLM llama.cpp backend on LNL's Xe2: https://www.reddit.com/r/LocalLLaMA/comments/1gheslj/testing...

Based on scaling by XMX/engine clock napkin math, the B580 should have 230 FP16 TFLOPS and 456 GB/s MBW theoretical. At similar efficiency to LNL Xe2, that should be about pp512 ~4700 t/s and tg128 ~77 t/s for a 7B class model. This would be about 75% of a 3090 for pp and 50% for tg (and of course, 50% of memory). For $250, that's not too bad.
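
To make the bandwidth-bound half of that estimate explicit, here is a small sketch; the model size and efficiency factor are illustrative assumptions, not numbers from the comment:

```python
# Token generation is roughly memory-bandwidth bound: each generated token
# streams the (quantized) weights once, so tg ~ efficiency * MBW / model_bytes.
# The model size and efficiency below are assumed for illustration.
b580_mbw_gb_s = 456      # theoretical memory bandwidth from the napkin math above
model_size_gb = 4.4      # assumed: ~7B params at ~4-bit quantization
efficiency = 0.75        # assumed: fraction of theoretical bandwidth achieved

tg_tokens_per_s = efficiency * b580_mbw_gb_s / model_size_gb
print(f"estimated tg: ~{tg_tokens_per_s:.0f} tok/s")
# ~78 tok/s, close to the comment's tg128 ~77 t/s estimate
```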

I do want to note a couple things from my poking around. The IPEX-LLM project [1] was very responsive and was able to address an issue I had w/ llama.cpp within days. They are doing weekly update releases, so that's great. IPEX stands for Intel Extension for PyTorch [2], and it is mostly a drop-in for PyTorch: "Intel® Extension for PyTorch* extends PyTorch* with up-to-date features optimizations for an extra performance boost on Intel hardware. Optimizations take advantage of Intel® Advanced Vector Extensions 512 (Intel® AVX-512) Vector Neural Network Instructions (VNNI) and Intel® Advanced Matrix Extensions (Intel® AMX) on Intel CPUs as well as Intel Xe Matrix Extensions (XMX) AI engines on Intel discrete GPUs. Moreover, Intel® Extension for PyTorch* provides easy GPU acceleration for Intel discrete GPUs through the PyTorch* xpu device."
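
For readers who have not touched the xpu device, a minimal sketch of what "mostly drop-in" looks like in practice (illustrative only; it assumes intel_extension_for_pytorch and the oneAPI stack described in the next paragraph are installed, with an Arc GPU visible):

```python
# Minimal sketch of using an Intel Arc GPU via IPEX's PyTorch "xpu" device.
# Illustrative only; assumes intel_extension_for_pytorch and a working
# oneAPI/driver stack are installed.
import torch
import intel_extension_for_pytorch as ipex  # registers the "xpu" device

model = torch.nn.Linear(4096, 4096).eval().to("xpu")
model = ipex.optimize(model, dtype=torch.float16)  # weight casting + kernel optimizations

x = torch.randn(8, 4096, dtype=torch.float16, device="xpu")
with torch.no_grad():
    y = model(x)
print(y.shape, y.device)
```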

All of this depends on the Intel oneAPI Base Kit [3], which has easy Linux (and presumably Windows) support. I am normally an AUR guy on my Arch Linux workstation, but those packages are basically broken, and I had much more success installing oneAPI Base Kit directly (without issues) on Arch Linux. Sadly, this is also where there are issues: some of the code is either dependent on older versions of oneAPI Base Kit that are no longer available (vLLM requires oneAPI Base Toolkit 2024.1, which is not available for download from the Intel site anymore) or in dependency hell (GPU whisper simply will not work; ipex-llm[xpu] has internal conflicts from the get-go), so it's not all sunshine. On average, ROCm w/ RDNA3 is much more mature (while not always the fastest, most basic things do just work now).

[1] https://github.com/intel-analytics/ipex-llm

[2] https://github.com/intel/intel-extension-for-pytorch

[3] https://www.intel.com/content/www/us/en/developer/tools/onea...

By @Sparkyte - 4 months
Intel over there with two spears in the knees looking puzzled and in pain.
By @matt3210 - 4 months
But can it AI?
By @pizzaknife - 4 months
tell it to my intc stock price
By @Lapra - 4 months
Unlabelled graphs are infuriating. Are the charts average framerate? Mean framerate? Maximum framerate?
By @JimRoepcke - 4 months
"B-series", huh?

I'm guessing their marketing department isn't known as the "A-team".

By @treprinum - 4 months
Why don't they just release a basic GPU with 128GB RAM and eat NVidia's local generative AI lunch? The networking effect of all devs porting their LLMs etc. to that card would instantly put them as a major CUDA threat. But beancounters running the company would never get such an idea...
By @tcdent - 4 months
If they were serious about AI they would have published TOPS stats for at least float32 and bfloat16.

The lack of quantified stats on the marketing pages tells me Intel is way behind.

By @headgasket - 4 months
My hunch is the path forward for Intel on both the CPU and the GPU end is to release a series of consumer chipsets with a large number of PCIe 5.0 lanes, and keep iterating on this. This would cannibalize some of the datacenter server-side revenue, but that's a reboot... get the hackers raving about Intel value for the money instead of EPYC. Or do a skunkworks ARM64 M1-like processor; there's a market for this as a datacenter part...