cinera_handmade.network/cmuratori/hero/code/code236.hmml

[video output=day236 member=cmuratori stream_platform=twitch stream_username=handmade_hero project=code title="GPU Conceptual Overview" vod_platform=youtube id=vbnozKJM0Oo annotator=Miblo]
[0:07][Recap and glimpse into the future of streaming via a video capture card]
[2:06][Run the game and note that Casey can see the game and we can't]
[3:00][Recap where we're at]
[4:31][Blackboard: GPUs]
[7:00][Blackboard: How the CPU relates to the GPU]
[10:59][Blackboard: A typical mobile setup]
[13:19][Blackboard: The implications of the system RAM and graphics RAM being separated by a PCI bus, i.e. latency]
[15:39][Blackboard: CPU vs GPU, historically]
[22:42][Blackboard: A high level overview of GPU architecture]
[29:03][Blackboard: How a GPU core works]
[31:38][Blackboard: "Shader" != "CPU code"]
[33:05][Blackboard: How if statements are executed on a GPU]
[36:46][Blackboard: How loops are worked through on a GPU]
[37:51][Blackboard: "Warp"][quote 353]
[38:28][Blackboard: Summarise what a GPU is]
[41:16][Blackboard: How we program for the GPU]
[46:14][Blackboard: "Pushbuffer"]
[53:26][Blackboard: Plan for tomorrow]
[56:38][Q&A][:speech]
[57:04][@ratchetfreak][Instead of two triangles, you can use a single triangle twice as big. It avoids the overdraw along the diagonal]
[58:33][@kknewkles][Quick CPU question: All it does at the barebones physics level is run electrons through if-statements, right? (Go left, go right, etc., with transistors)]
[59:34][@NoRaD91][Can't you run a Xeon Phi as your main processor, or did they make them dedicated only? Not like price / performance makes sense there]
[1:00:15][@ChronalDragon][What barriers are there currently from just using the CPU as a GPU?]
[1:01:49][@aceflameseer][Can we make a "first person" 3D mode of the game, just for education?]
[1:01:55][@cubercaleb][Where does the PS4 / XBone processor lie on the CPU / GPU spectrum?]
[1:04:18][@quartertron][Intel has killed off many projects that made good money but had bad margins]
[1:05:56][@Longboolean][I've been told that lots of graphics drivers optimize for specific games (at the driver level). How does this fit into the equation? How do those optimizations make some games run better?]
[1:08:05][@sssmcgrath][Why does everyone good who works at Intel hate Intel, yet simultaneously Intel's engineering is so far ahead of everyone else's? It doesn't compute!]
[1:08:36][@garryjohanson][What did you think about Larrabee?]
[1:08:56][@chr0n0kun][Does Vulkan fix the problem with OpenGL of not being able to transfer buffer-objects between processes with separate address spaces?]
[1:09:36][@angus_holder][Intel's compiler is meant to be really good, right?]
[1:10:11][@Andremm2][Can you give us some insight (without breaking any NDAs) into how different console graphics APIs are from OpenGL?]
[1:12:26][@chr0n0kun][Sharing objects between applications without CPU overhead]
[1:12:43][@kknewkles][Why do you think there are no games about programming / hardware / history of PC / hardware? The domain is unimaginably rich]
[1:13:22][@hguleryuz][How does GDDR RAM for GPU or concept of "memory chip designed specifically for GPU" enter into this picture?]
[1:13:48][Blackboard: The gist of GDDR]
[1:16:36][@NoRaD91][Could you one day maybe do a short pre- or after-stream summary about your thoughts on OS design and what you would do differently given current hardware?]
[1:16:57][@kknewkles][I mustered up one more. How come The Witness has 4GB RAM as minimum requirement? I don't doubt it has great optimization (as Jon is an apex-level programmer). Is it because nowadays everyone has 4 gigs at least and they thought it's unfeasible or too limiting to go below that? What can be the design behind that requirement?]
[1:19:47][@elxenoaizd][Why does it always seem like PC games claim that they require much more hardware power than they need? Do they want the extra power - just in case something goes wrong - or what?]
[1:20:59][@chr0n0kun][Having OS-level support for GPU resources for compute and graphics so that 3D graphics tools etc. can interoperate efficiently, e.g. in VFX production where you have lots of tools using the same data]
[1:21:45][@Longboolean][What would be the disadvantages (if any) of having a big beefy CPU?]
[1:22:00][@Boorocks998][Have you seen that guy that is going to recreate Quake in a Handmade Hero style?]
[1:22:21][@pankupunka][How long do you expect this project to take?]
[1:22:48][@kknewkles][From what I get, Fallout 4 just allocates itself an 8 gig block. Abhorrent]
[1:23:17][@kil4h][Quick question about strict aliasing, prompted by your forum post (not defending it): how would you propose compilers understand that pointers do not overlap (to optimize loads and so on)? Not sure how we could improve generated code without that kind of guarantee, assuming we still need to support old code]
[1:25:27][@Andremm2][Out of curiosity, back then when it came out, was GDI just a wrapper for OpenGL?]
[1:26:29][@cubercaleb][Restrict doesn't seem to work with VS2013 / 2015]
[1:26:48][@NoRaD91][Isn't Restrict, like, stupid limited though? And you don't have Alias (one that's definitive at least)]
[1:27:18][@Neitchzehrer][Why is it that games get more difficult to play as Windows OS gets more advanced, i.e. playing a game from Windows XP on Windows 7?]
[1:28:00][Wind down][:speech]
[/video]