It's time for the PCPer Mailbag, our weekly show where Ryan and the team answer your questions about the tech industry, the latest and greatest GPUs, the process of running a tech review website, and more!
On today's show:
00:37 – Ryzen vs. Coffee Lake for Meltdown/Spectre?
02:50 – NVIDIA CPUs?
04:35 – Selling bare GPUs for water coolers?
07:01 – Hard drive data recovery?
09:35 – Why does Apple use ARM?
11:22 – PCPer's advertising disclosure?
12:44 – Hard drive cloning software?
14:18 – PCPer history?
16:53 – Mobile World Congress?
17:26 – Pinnacle Ridge IPC improvements?
18:43 – Consumer cards for machine learning?
21:47 – Refining display technologies?
Want to have your question answered on a future Mailbag? Leave a comment on this post or in the YouTube comments for the latest video. Check out new Mailbag videos each Friday!
Be sure to subscribe to our YouTube Channel to make sure you never miss our weekly reviews and podcasts, and please consider supporting PC Perspective via Patreon to help us keep videos like our weekly mailbag coming!
Re: 02:50 – NVIDIA CPUs? “Could NVIDIA get an x86-64 licence to compete with Intel and AMD?”
NVIDIA explored this in the past but could not come to an agreement with VIA Technologies.
Nvision: Nvidia CEO Loves VIA’s Nano
link: https://www.youtube.com/watch?v=7LHB6bKnpn4
VIA later partnered with the Chinese joint-venture company Zhaoxin. Today they have a new x86/x86-64 SoC, the KX-5000, with a die shrink to TSMC 16nm FinFET coming in the KX-6000 SoC and a later KX-7000 SoC on a smaller node. Against Intel and AMD, they position these as high-performance, low-cost x86-compatible microprocessors.
source: http://www.zhaoxin.com/Solution.aspx?id=3
source: http://www.ccf.org.cn/upload/resources/file/2018/01/26/51490.pdf
source: http://www.centtech.com/
For NVIDIA, the x86/x86-64 train has left the station!
I almost hate to ask… but with the new Ryzen APUs, what’s the risk of miners buying these up? As I understand it, the CPU doesn’t do much in a mining rig, so the new APUs might allow them to squeeze some extra GPU compute units in. (I know you lose PCIe lanes with the APUs, but most mining rigs use PCIe x1 links anyway.)
Apple uses only the ARMv8-A ISA, and it chose ARM to tap into the OS/software ecosystem built up over decades around ARM Holdings’ ISAs.
Apple is a top-tier architectural licensee of ARM Holdings, and that license gives Apple the right to take the ARMv8-A ISA and create a fully custom core that executes it. The same goes for the other top-tier architectural licensees (Samsung and others) that license only the ARMv8-A ISA and design their own respective underlying CPU micro-architectures to execute it.
Operating systems and software ecosystems cost more to develop and maintain than the hardware. Apple saved billions because ARM-based hardware comes with decades of compilers, OSes, SDKs, and other software tools ready to use, sparing Apple from having to create that OS/software ecosystem from scratch.
Apple was a very early user of ARM IP, even before there was an ARM Holdings (ARM Holdings was in fact established because of Apple’s use of ARM IP) for Apple’s Newton(1).
“The Newton project was a personal digital assistant platform. The PDA category did not exist for most of Newton’s genesis, and the phrase “personal digital assistant” was coined relatively late in the development cycle by Apple’s CEO John Sculley,[1] the driving force behind the project. Larry Tesler determined that an advanced, low-power processor was needed for sophisticated graphics manipulation. He found Hermann Hauser, with the Acorn RISC Machine (ARM) processor, and put together Advanced RISC Machines (now ARM Holdings).[2] Newton was intended to be a complete reinvention of personal computing. For most of its design lifecycle Newton had a large-format screen, more internal memory, and an object-oriented graphics kernel.” (1)
(1) “Apple Newton”, https://en.wikipedia.org/wiki/Apple_Newton
Smartphones are already getting AI hardware to run trained models, models that are first trained on powerful cluster-computing AI systems. So smartphones already have neural-network chips; those chips are too weak for the training part, but they are perfectly able to run the models once they are trained.
PCs could be getting AI functionality too, and TensorFlow models can already run on a GPU even without dedicated Tensor Cores being available. AMD’s Zen has neural-net branch prediction, DSPs can also handle AI workloads in addition to GPUs, and a lot of the camera features on phones/devices already use AI to some degree.
Adobe Photoshop uses GPU-accelerated AI for some of its advanced filtering features, and ditto for other graphics software. Look for games to begin utilizing AI for targeted AA, shading, and even ray tracing and other effects acceleration, as that is being looked at also.
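The training-versus-inference split described above can be sketched with a tiny network: once the weights exist, running the model is just a few matrix multiplies, which even a phone-class chip handles easily. A minimal NumPy sketch (the weights here are random stand-ins for illustration, not a real trained model):

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def infer(x, w1, b1, w2, b2):
    """Forward pass of a tiny 2-layer network: cheap enough for a phone."""
    h = relu(x @ w1 + b1)   # hidden layer
    return h @ w2 + b2      # output layer

# Pretend these weights came from an expensive training run on a cluster;
# inference only needs to read them and multiply.
rng = np.random.default_rng(0)
w1, b1 = rng.standard_normal((4, 8)), np.zeros(8)
w2, b2 = rng.standard_normal((8, 2)), np.zeros(2)

x = rng.standard_normal((1, 4))   # one input sample
y = infer(x, w1, b1, w2, b2)
print(y.shape)  # (1, 2)
```

Training, by contrast, repeats this forward pass plus a backward pass over millions of samples, which is why it stays on the big machines.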
Sounds like the guy asking about the G-Sync monitor really just needs a FreeSync/FreeSync 2 monitor.
Hi, I’ve got a bit of a strange question. I have an old Packard Bell Pentium 1 PC with an AT power supply that no longer works. If I bought something like a Corsair 450-watt modular PSU and an ATX-to-AT power adapter, would it work, or would I have to buy a replacement AT PSU? Thanks!
I remember AMDMB.com. I was also one of the first to sell that Asus motherboard way back when. It was sold in an unmarked white box because of Intel’s bribery.
Question for the next mailbag:
I have a Ryzen system that can accept one M.2 SSD. I’m wondering if there is an appreciable performance penalty to add an M.2 to PCIe adapter such as the one at this link.
https://www.amazon.ca/StarTech-com-M-2-Adapter-Profile-Express/dp/B01FU9JS94/ref=sr_1_17?ie=UTF8&qid=1519654166&sr=8-17&keywords=ssd+adapter
Thank you for your time!
Sorry – to clarify I want to know about a performance penalty if I add a second M.2 with the help of the adapter.
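One way to answer this empirically would be to time large sequential reads of the same file on each drive and compare. A rough sketch follows; the paths are hypothetical placeholders, a proper benchmark should use a dedicated tool like fio or CrystalDiskMark, and the test file should be much larger than RAM to avoid measuring the page cache:

```python
import time

def read_throughput_mb_s(path, chunk_mb=4):
    """Sequentially read `path` and return a rough throughput in MB/s."""
    chunk = chunk_mb * 1024 * 1024
    total = 0
    start = time.perf_counter()
    with open(path, "rb") as f:
        while True:
            data = f.read(chunk)
            if not data:
                break
            total += len(data)
    elapsed = time.perf_counter() - start
    return total / (1024 * 1024) / elapsed

# Hypothetical usage: one large test file per drive, then compare.
# print(read_throughput_mb_s("/mnt/m2_onboard/test.bin"))
# print(read_throughput_mb_s("/mnt/m2_adapter/test.bin"))
```

If both numbers come out similar, the adapter slot (and whatever PCIe lanes feed it) is not the bottleneck for that workload.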