(comments)

Original link: https://news.ycombinator.com/item?id=38240861

In short: the spot ad for Valve's new Steam Deck OLED was shot practically, using a rig full of actual Steam Deck OLED units rather than CGI. Commenters were impressed, noting how labor-intensive video production is (a common rule of thumb is at least an hour of work per finished minute of amateur video, and far more in professional work), and mused about the eventual fate of one-off props and jigs like the orb-shaped rig. Side threads cover OLED burn-in, which modern panels mitigate by tracking each pixel's usage history and adjusting drive levels at the cost of reserved brightness headroom; battery life, with roughly 8 hours reported at the lowest wattage settings; the OLED panel supplier, reportedly Samsung with BOE possibly dual-sourcing; improved screen repairability over the original Steam Deck; and a debate over whether Android or Valve's Steam support deserves more credit for advancing Linux adoption.

Related articles

Original text
The Steam Deck OLED spot ad was made with Steam Deck OLEDs (idlethumbs.social)
375 points by neffo 16 hours ago | 56 comments

Gorgeous! It's a shame that CGI is so prevalent, because without context I would totally assume this was CGI. Some of the shots would fit right in with a Portal cutscene.

Imagine playing on the one in the center of the orb. It's the ultimate Steam Deck gaming throne.



> Some of the shots would fit right in with a Portal cutscene.

The music really helps sell that idea.



This is awesome. Video production just takes so many hours for each second and I don't think people know how much work it takes until you help out on a shoot.


It's why AI-generated movie shots would make a lot of sense. Hollywood spends billions of dollars, builds and blows up elaborate sets, and hires hundreds of thousands of people... just to be able to have pixels move in a pleasing way. How much of that will be cut out when we just go straight to generating pixels? CGI goes in that general direction but it's still very labor intensive.


I do a lot of YouTubing and the consensus is that every minute of video takes at least 1 hour of work - and this is just amateurs messing around.


In the pro world it's common for one person to spend hours on seconds, multiplied over dozens of people working on those same seconds.


When I see things like that I always think about their lifecycle. The frame will be stripped of the devices for another display. It'll still hang around in some honored spot, then they'll get new staff and it will be shunted into storage. In 3 years it will be disassembled and tossed into the dumpster because no one will have room for it at home.


I work for a company which has made a huge satellite constellation and after a revision of a hardware component has run its course thousands have made their way to space, incomplete/failing units are shredded, and only a handful are kept. When the satellites eventually burn up in the atmosphere those ~3 builds are all that remain of the entire project, thousands of man-hours to engineer and build hardware that the world will never see. I think about those few survivors a lot.


Oh the pain


We know you're talking about spacex


Props, acts and jigs are always like this. In reality, a small metal structure like this is not really so precious that it need be cherished or repurposed. Metal is very recyclable. Its function was brief in time, but relatively high in impact.


is there something inherently bad in this? do you feel something is lost?


I do feel a twinge sometimes. I tend to get a bit too sentimental about artifacts.


I love how each of the decks being a proper computer made this much easier to pull off. Like, driving a display in a dummy device would probably get you the same effect, and take longer. But that's likely the route a third-party production company would take.


Part of me thinks this is crazy, the other part wants to see this running Doom with 360° vision.


On the note of OLED increasing battery life, does anyone know the max number of hours of gameplay you can get on a single charge when the game is as simple as "snake"?

I have searched for answers on Reddit and Google etc. but the only number I have found is 8 hrs, which is not for low-power games. Given the Steam Deck runs Linux and is hacker friendly, one should be able to juice it for much, much longer if only early retro games are played.



8hr sounds accurate. There’s a wattage slider and you can only set it so low. You might be able to get more out of using the desktop and forcing some idle modes, however it’s impressive enough as is at a few watts and many hours on a portable Linux machine occupying less total space than a 13” laptop.
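To sanity-check the 8-hour figure, runtime is roughly battery capacity divided by average system draw. A minimal sketch — the 50 Wh capacity and the draw values here are assumptions for illustration, not measured specs, and the TDP slider only bounds the APU, so real total draw also includes the screen and radios:

```python
def runtime_hours(battery_wh: float, avg_draw_w: float) -> float:
    """Rough runtime estimate: pack capacity divided by average total draw."""
    return battery_wh / avg_draw_w

# Assumed ~50 Wh pack at a few plausible total-system draw levels.
for draw_w in (6.0, 9.0, 15.0):
    print(f"{draw_w:4.1f} W -> {runtime_hours(50.0, draw_w):.1f} h")
```

At a bit over 6 W total, 50 Wh works out to roughly 8 hours, matching the figure people report; squeezing out much more means cutting screen brightness and idle draw, not just lowering the APU slider.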


Norm from Tested has a short vid on Twitter of him in the sphere:

https://twitter.com/nchan/status/1722688222713749881



I am continuously amazed at how awesome valve is and what awesome products it makes. Also steam supporting linux single handedly advanced adoption of that os. Apparently the best types of companies are those founded by deeply technical people, still owned and run by them, no venture capital. In an ideal world we would favour such companies over toxic ones.


Without taking away from the great things they are also doing, I'm mostly amazed at how bad and laggy the Steam app is, and has been for years, on all 3 platforms I've used it on.

(I'm sure there are loads of people who have never had any issues but to me that's like people saying there are no problems with Linux on the desktop because they don't have any problems.)



It could be improved, yes, but that's an acceptable kind of issue. Steam's important functionality — package management for games — works well and is reliable. It doesn't spy on me, doesn't push crap, doesn't use dark patterns. We need more apps like Steam tbh.


And they only take a 30% cut! Practically altruism.


What is an acceptable fee structure for the service they provide?


That’s messed up indeed.


>Also steam supporting linux single handedly advanced adoption of that os.

Android using Linux is what single-handedly advanced the adoption of Linux among consumers.



Adoption on desktop is 100% more related to Steam support, at least in the gaming segment.

Anecdotally, among friends and colleagues, people are only staying with Windows for gaming support.

People generally dislike Windows but are forced to stay there for gaming. As support for Linux improves, they'll be less willing to put up with Windows' BS.



Saying Android is Linux is like saying macOS is BSD.


No it isn't.

Android is as much a Linux distro as any other.



Android is a Linux distribution unlike any other. They use a completely different framework for drivers for hardware, they have their own patches for binder and other things that see no use outside of Android and will never get upstreamed. They have their own libc which is not used in any other distro. Very little of the work that has been done to make Linux work better on Androids has benefited the rest of the Linux ecosystem. Even the WiFi/Bluetooth drivers, which is a massive shame.


>They use a completely different framework for drivers for hardware

No, both use kernel modules or statically compiled code for the part of the driver that actually talks to the hardware.

>they have their own patches for binder

Binder is part of mainline Linux, but yes I guess technically there are some patches that are related to binder, but remember that Android works on a mainline kernel.



Android uses the Linux kernel and keeps up with upstream to some extent.

macOS uses the XNU kernel.

Though as a user that likes having control over the software, I recognize that not having GNU/Linux being number one is a bit of a waste. (though one weekend of fighting NVIDIA and wayland tamed that quite a bit. Somehow my DE does not load with the proprietary driver unless I also load nouveau for some strange reason).



Android uses the latest LTS kernel and works using a mainline kernel provided mainline supports the hardware you are on.

>though one weekend of fighting NVIDIA and wayland

Wayland is freedesktop software which is different than GNU.



Not as a desktop.


Do we know who's the supplier of the OLED panel? Samsung? LG? BOE?


Samsung, but BOE may be dual-supplying it: https://twitter.com/SadlyItsBradley/status/17227592388066431...


Last I heard, it's the same supplier as the Switch OLED; that's why it uses MIPI over eDP. Samsung is the manufacturer I think


That's what I've heard too. And yeah, Samsung makes the Switch's panel according to iFixit's teardown.

https://www.ifixit.com/News/53272/nintendo-switch-oled-teard...



(GP)> "same supplier as the Switch OLED"

> That's what I've heard too

Is any of this confirmed or just rumor from the LTT video?



The other comment chain (2 levels above but in the same first-level thread) links to this tweet with a code screenshot that seems to confirm it: https://twitter.com/SadlyItsBradley/status/17227592388066431...

    if ((vendor_product->product == GALILEO_SDC_PID) || (vendor_product->product == GELILEO_BOE_PID)) {
        // ...
    }
I'm assuming SDC ("GALILEO_SDC_PID") is Samsung Display [Corp?] (a search seems to confirm that's their acronym).


I got pretty excited that it was Jake Rodkin posting this on something with the Idle Thumbs name.


Ultra-bright means higher burn-in risk, or am I wrong on this one?

Edit: thanks for clarifying



Only if you drive the display at the top end. If you end up driving in the middle, it helps prevent burn-in.

> By counting the time each subpixel is displayed and at what brightness, a "wear level" can be determined for each pixel, using an algorithm to estimate the luminance degradation this can be compensated for. However, to do this, you must have some spare luminance headroom that gets utilized as the display gets older. Or alternatively, if the display unlocks full maximum luminance when new without saving any headroom, the algorithm would dim the other pixels over time to bring them down to the level of the burned-in pixels, so the peak luminance of the display would diminish over time as the burn-in occurs.

https://arstechnica.com/gadgets/2023/11/why-oled-monitor-bur...
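The compensation scheme in the quoted passage amounts to per-pixel bookkeeping: accumulate a wear value proportional to time-at-brightness, then boost each pixel's drive level to offset its estimated luminance loss, spending reserved headroom as the panel ages. A toy model with a made-up degradation constant — not any vendor's actual algorithm:

```python
import numpy as np

# Made-up constant: fractional luminance loss per unit of accumulated wear.
DEGRADATION_PER_UNIT_WEAR = 1e-6

class WearCompensator:
    def __init__(self, height: int, width: int):
        # Accumulated brightness-seconds for each pixel.
        self.wear = np.zeros((height, width))

    def record_frame(self, brightness: np.ndarray, dt_s: float) -> None:
        """Accumulate wear proportional to brightness and on-time."""
        self.wear += brightness * dt_s

    def compensate(self, target: np.ndarray) -> np.ndarray:
        """Boost each pixel's drive level to offset estimated luminance loss.

        This spends luminance headroom: drive is clipped at 1.0 (full),
        which is the point where visible burn-in would start to show.
        """
        remaining = 1.0 - DEGRADATION_PER_UNIT_WEAR * self.wear
        return np.clip(target / remaining, 0.0, 1.0)
```

Real panels do this in the display controller with calibrated per-subpixel degradation curves; the clipping here mirrors the quote's trade-off between reserving headroom up front or dimming the whole panel over time.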



Do we know if the steam deck has this feature?


Fortunately, in the case of the Steam Deck it's possible to replace the screen if this becomes a problem in a couple of years, and new parts won't cost as much as a new device.


I hope replacement is easier than with the OG Steam Deck, because to replace that screen you have to completely disassemble the device and apply heat to unglue the screen...

https://www.ifixit.com/Guide/Steam+Deck+Screen+Replacement/1...



It's one of the improvements they listed regarding repairability:

> Improved display repair/replacement to not require taking rear cover off



Does it still work as a very efficient SD card cutter though?


It does.


They have made a number of changes to makebit easier to fix, not sure about the screen though.


> makebit

I make this same mistake on iPhone all the time. Is it just me or does Apple need to step up their keyboard and autocorrect game?



Burn-in on OLEDs is really just uneven wear. I don’t think it really matters how bright they get for that, unless panel heat is an issue at higher brightnesses.


Higher brightness leads to faster wear. If the wear is uneven, this leads to faster burn-in at higher brightness.


Burn-in on CRTs was just uneven wear, too, but still a pain in the ass.


I remember encountering CRTs burnt in so badly it was hard to read stuff in the worst areas (e.g. the taskbar clock or login prompt) but I haven’t encountered anything remotely close to that with current OLEDs. My iPhone and TV have no signs at all, and the last device I used that had legitimately easy-to-detect burn in was a Nexus One test device that sat on my desk with the screen on all day every day while I built an Android app in 2012.


Damn, no call out to DeckMate by name in the thread? Sad


I assume that's what "some Steam Deck clips made by an accessory designer on Reddit" is referring to? Going back and watching the video again, yeah, it definitely looks like Deckmate grips holding all the decks to the orb frame.








