This episode: Dave's been blogging about the tangled webs we weave with dependencies and the internet, we talk more about web workers, and making peace with production code written by your earlier self. And Jeremy Keith kicks off part 1 of a reading of the history of the web.
MANTRA: Just Build Websites!
Chris Coyier: Hey!
Dave Rupert: Hey there, Shop-o-maniacs. You're listening to another episode of the ShopTalk Show. I'm Dave, homeschool happening in the background--Rupert and with me is Chris--in the big smoke--Coyier. How are you today, Chris?
Chris: Oh, good. Good. Good. Yeah, just -- it is -- it is damn smoky out there. Just one more terrible thing to add to this crazy year.
Dave: 20-frickin'-20. My brother lives in South Dakota and it was 100 on Saturday when I talked to him. Then it was 15 degrees on Tuesday and snowing.
Dave: I got to applaud South Dakota for just saying, "Let's get this year done with."
Dave: Let's just skip Fall entirely. Just wrap it up. Get out. Go.
Chris: Well done, South Dakota.
Dave: Advanced weather technology there, so they're thinking ahead.
Chris: That's nice. Yeah, I've been just finding some solace in work, really, for a long time now. Just be like -- I don't know -- I'm just going to do what I can each day. I mean that's what I've been doing for my whole life, so why not just keep doing it now, you know?
Dave: Well, I heard somebody say -- I think it was Melanie Richards from Edge -- "What are you looking forward to other than work?" I just was like, whoof.
Dave: [Laughter] Um… Oh, boy. Might go for a bike ride on Saturday.
Dave: It was a tough question because your options for doing stuff are so limited.
Chris: It really is. I read Jeremy Keith's -- he does excellent movie reviews. That dang bastard. Everything he does, he writes up so well. Just gold star blogger, not to mention you've been a damn gold star blogger lately. Congratulations.
Dave: I've had at least three posts. Yes. [Laughter]
Chris: Yeah. No, it's like every day there's something good to read. Then I always have extra thoughts, so I'm going to be blasting you with link posts as we go here.
I just quoted you today, actually, because your post about the tangled web we weave was a good one. I think you might have even alluded to it on the show: how you start out having problems with this CMS preview thing, but really you end up touching 20 technologies along the way. Oh, my god!
Dave: Yeah. I mean it was 11ty, Tailwind, Netlify, right? Netlify CMS, I should say.
Dave: These technologies kind of all go together. You can use Netlify CMS with 11ty. Tailwind doesn't care what your CMS is. But getting it into 11ty is a little sticky, but people have done it.
Chris: It does require a build process, right? Otherwise, you're not really reaping the benefit of Tailwind if you're shipping the whole package.
Dave: Yeah, and Tailwind is unique in itself. It's an npm package with a Sass-like structure. There's PostCSS, so you run it through Autoprefixer and a purge step. Then you have to kind of write a Node script that does that while doing your 11ty compilation, you know, like npm run dev or something like that.
Dave: There is actually a really good post on CSS-Tricks on how to do that and there are a lot of good starters.
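The pipeline Dave describes -- Tailwind run through PostCSS with Autoprefixer and a purge step, alongside an 11ty build -- roughly looks like this hypothetical `postcss.config.js`. The plugin packages are real, but the glob paths and project layout are assumptions, not necessarily the setup from the CSS-Tricks post:

```javascript
// postcss.config.js -- hypothetical sketch of the Tailwind pipeline
// Dave describes: Tailwind -> Autoprefixer -> purge unused classes.
module.exports = {
  plugins: [
    require('tailwindcss'),
    require('autoprefixer'),
    // Scan the 11ty templates so unused utility classes get stripped.
    // These glob paths are assumptions about the project layout.
    require('@fullhuman/postcss-purgecss')({
      content: ['./src/**/*.njk', './src/**/*.md', './src/**/*.html'],
    }),
  ],
};
```

You'd then run this alongside the 11ty compile with something like a `"dev"` npm script that watches both in parallel, which is the sticky "write a Node script that does that" part Dave mentions.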
Dave: I followed the CSS-Tricks post and I was like, cool. Then it was getting into the Netlify CMS part. I was like, oh, boy. I started just pulling things from Andy Bell's Hylia to pull over little pieces to get the Netlify CMS. As simple as that is, there are a few gotchas there.
Dave: You have to end up writing React preview components to get the blog post preview to work. Does that make sense?
Chris: Yeah. You've done something I've never done with it, which is, get it working locally.
Chris: There is a way you can use Netlify CMS where you just say, "Screw it. This CMS, it works on prod but not on dev."
Dave: Right, and so I was trying to get it to work locally.
Chris: It's very buddy-buddy with Tailwind, for sure. Yeah.
Dave: Yeah, it's the best friend of Tailwind. It's Tailwind's best friend. [Laughter]
Dave: I was using Alpine and then, kind of by default, the examples from Tailwind UI, they're all open. All your menus are open, and so I load up the Netlify preview and all my dropdowns are open. Everything is open. I just was like, "Ooh, that's wrong." You know? I figured out later-later that I could just put style="display: none" because that's all Alpine does when it detects it has to do something.
Dave: It just says "display none" on an opener. But I was just like, "Oh, well, Alpine is not working in the Netlify preview. That's weird. Let me debug."
Long story short--this is already long--I'm 20 dependencies deep and I'm looking at React Frame Component, which is like an iframe builder-injector thing. My Alpine script was not injecting inside the iframe. I could inject it outside the iframe but not inside the iframe. It's just due to a limitation of where the component renders -- the preview component was building the outside of the iframe, not the inside, if that makes sense. It's kind of confusing, but it was passing the insides to this React Frame Component.
Again, I was just trying to do two things, three things: CMS, 11ty, and Tailwind. Here I am, digging into React source code for third-party components and stuff like that. I don't know. For me, I love this stuff and I am still very bullish on the JAMstack and all that, and Netlify is still a very great product. Thank you for sponsoring. But I was just like, man, this is tough. This is just symbolic of the problems I'm having, in general, with modern Web dev. It's not picking on these tools specifically. It's just in general.
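The Alpine detail Dave hit on -- that its show/hide behavior just toggles an inline `display` style, so you can hand-set `display: none` in a preview where Alpine never loads -- can be sketched like this. It's a simplified stand-in, not Alpine's actual source, and the fake element object is just for illustration:

```javascript
// Simplified sketch of what Alpine's x-show effectively does: it doesn't
// remove the node from the page, it just toggles an inline display style.
function applyXShow(el, isOpen) {
  el.style.display = isOpen ? '' : 'none';
  return el;
}

// A plain object standing in for a dropdown menu element.
const menu = { style: { display: '' } };

applyXShow(menu, false);
// menu.style.display is now 'none' -- the same inline style you can set
// by hand when Alpine hasn't been injected (e.g. in a CMS preview iframe).
```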
Chris: Yeah. Yeah. That kind of early days feeling. When I read that, I just happened to be linked up because I read WordPress news as well. There was a post on The New Stack or something that was kind of like a conversation between this guy and Matt, you know, WordPress Matt.
Matt didn't have terribly kind things to say about JAMstack, which, on the surface, you can be like, well, why would he? You know? Matt is not particularly incentivized. He's incentivized to back up the WordPress approach, but he's not that kind of guy. I think he's a technology guy. I think, in this case, it wasn't a disingenuous response, but he was kind of pointing to one particular aspect of JAMstack that you were kind of pointing at, too, in a different sort of way. But you've heard this criticism before, I think: that it's kind of a house of cards, depending on how much more stuff you need.
JAMstack, on purpose, is saying, "We don't do all the back-end stuff. We're not your data store, necessarily. You need to find some other thing to do data store. We're not your API. You need to use serverless or something for your API. We're not X, Y, and Z on purpose."
Chris: That's the deal. I don't know. There are lots of reasons for that. He was kind of being like, hey, you might end up stitching 12 services together and then they're all as important as the next one. Your chain is only as strong as the weakest link, et cetera. Just fair enough, you know?
But I don't think Matt is wrong, necessarily. Although, 12 is a weird number to throw out. If I was evaluating a stack of, like, how am I going to build this thing, and I came to the conclusion that I need to stitch together 12 services to make this thing JAMstack, I'm an adult. I'll just be like, that is a house of cards. I'm not going to do that.
Chris: I wouldn't pick JAMstack in that situation but I would in all kinds of other situations. It doesn't make JAMstack bad. It makes house of cards, 12 services, JAMstack a bad choice sometimes.
Dave: Right. Right. In some ways, it's the exact same problem but on a B2B scale. I'm trying to stitch three Node packages together. The problem is huger if you're trying to stitch three businesses together.
Dave: This is my database business. This is my--whatever--search business. This is my--whatever--product AI business. Those are--
Chris: Yeah. Yeah. You know what's funny is Netlify already knows this. They've already solved a lot of this. They know you need help with your forms. They know you need help with auth.
Chris: They know you need help with your serverless functions. They do all that stuff. It's not exactly a house of cards when you're using a lot of Netlify services. It's Netlify.
Chris: Just like WordPress, you know. They know you need X, Y, and Z, so they give it to you, and there are certain things they can't. Even if you sell stuff on WordPress, well, great because they have some in-house tools for that. You're not leaning on a million things, but they are not a payment processor. You still need Stripe.
Chris: Just like you need Stripe on the JAMstack, so it's still a chain of stuff you're stitching together, to some degree. It's just interesting, you know.
Dave: Yeah. Well, and it kind of doesn't acknowledge that people are already using services, like you're saying. I was already using Stripe for my payments, so I had WordPress and Stripe or whatever, and I was already using WooCommerce or another service for--
Chris: Yeah. I already have Cloudflare in front of it.
Dave: Yeah, already. Yeah.
Chris: Something for deployment that's different.
Dave: Yeah. We have, what? We use Local by Flywheel to deploy our WordPress, so we have a service through Buddy.
Chris: Through Buddy.
Dave: We have two services just to spit out our WordPress.
Chris: But Buddy connects to GitHub.
Dave: Four services.
Dave: [Laughter] How many more do we have to go?
Dave: We use, what, our MP3s. They don't even go into the WordPress. They go into Simplecast.
Dave: Five services. If I were to critique Matt's critique, it would just be that we're already using a lot of different things and gluing them all together.
Dave: Yeah. Then I think Matt Biilmann from Netlify wrote a response.
Chris: Did he really? I didn't read that one.
Dave: Yeah. You should. You should. It's over on the Netlify blog, but it's just sort of saying, I think, what you're saying. It's just like, "We know there are limitations." But they came from a client-services world where they were building websites and were just like, "Dude, there has to be an easier way to build these sites. Not every site needs a whole MySQL database."
I agree. I get really far with JAMstack projects. I'm looking at another project here that's taking somebody off a database. You know?
Dave: It's like -- I think a lot of people are like, "I would just rather code this than to do this whole CMS mumbo-jumbo." If you have the skills to do that, the literacy, there is a literacy problem, but if you have the literacy to do that, it's great.
Chris: Yeah. Yeah. I think a stance I will take, rather than just being like, "Oh, it's gray," my stance is that it's gray, in a way. It's that Matt's going to defend WordPress and he's not wrong, Matt Mullenweg.
Chris: Matt Biilmann is going to defend Netlify and he's not wrong. It's almost like I don't care that much about their opinions because they're neck-deep. They're 50 necks deep into what they're doing. They better believe in it. You better believe in it. I want you to believe in it.
Here's me on the outside who has production projects in both being like, these are both good solutions for different situations.
Chris: Listen to me, people! [Laughter]
Dave: Listen to me and only me.
Chris: Yeah. [Laughter]
Dave: On that data point, one of the problems I'm having -- I have a few Nuxt Vue projects, kind of prototypes, built out. I'm like, okay, now I need to get it into a database. I could always just Firebase it because, whatever, Google just wants my money forever.
Chris: Yeah. I wouldn't fault you there.
Dave: Well, I know that path. I've done that path. It's pretty good.
Dave: But I know that path. The other thing would be like going down -- looking at all these databases that provide a GraphQL kind of endpoint.
Chris: Yeah. Yeah. Yeah.
Dave: That's what I'm looking at and I've looked at Hasura, Prismic, Fauna.
Chris: Fauna. Yeah.
Dave: I'm just looking at all these things and each has its pluses and minuses, but you hit a little roadblock on each one of those little services. You're like, oh, this one doesn't give you the GraphQL schema; you just have to make that up. Or like, oh, this one is very cool, it's on Postgres -- I know Postgres -- but I have to manually configure every single table and access to every single field. I'm just like, oof, could that go easier?
Dave: The other one is like the GraphQL that spits back is all weird, you know?
Dave: I'm just like, there's no--
Chris: Sanity is cool too, but then they have a special query language you have to use or you should. They really encourage you to use.
Dave: Yeah. Yeah, and Sanity is on my list, but it's like, oh, should I--? But I know about the GraphQL-like syntax.
Dave: Which I guess I'm not opposed to, but I guess I just was like, if I'm going to bite off GraphQL, I might as well chase down GraphQL, but each one has been a very interesting learning experience.
Chris: Mm-hmm. Well, they're funny because they're right. Their little language is better. It's cooler. It's more powerful.
Chris: But it's also just theirs, only.
Dave: Sure. Sure.
Chris: If you had a schema, Dave, if you're doing a GraphQL route and you can stomach the idea of writing your own schemas and dealing with that, at least then it's portable.
Chris: Yeah, which is, you know, you may not regret that. You could get that over on some AWS thing at some point or bounce between services because it's just describing what your data needs are, and that's pretty transferable in a way that some of these other choices are not. Nothing is like Firebase. Firebase is its own thing.
Dave: Well, and somebody -- Charlie Peters was saying, "Just spin up a Mongo somewhere," I assume like Heroku or something, "and then get some ORM and some resolvers and everything." They're not wrong. That's a great choice, but is that more than what I want to do right now? I really just want the GraphQL endpoint because I want to start querying. I want to get a list of pages. You know what I mean?
Dave: That's objective A. Just give me a list of pages. But, hey. I'm getting there. Long and winding road.
Chris: Well, I think we're pushing to that. Look. I mean can you imagine the conversation we just had, like three years ago, listening to us talk? Are we even front-end people anymore? What is happening?
Dave: Step one: Build a whole entire Rails app. Step two-- [Laughter]
Chris: Yeah, we're talking very back-end--
Dave: --Configure. Write the API all by yourself. It's a REST API. You know? But now we're just like, "I want this but I want it easy," is sort of my problem. You know?
Chris: Yeah. Yeah.
Dave: Easy. Portable.
Chris: Or, like, can you imagine if you, Dave, didn't have to do that part? If there was somebody else who you were working with that's like, "You know what? I live and breathe this data storage API." They'd be like, "I would like to set that up for you. I think I can make smart choices when it comes to what services to use, how to configure it, what the pluses and minuses are. That's my specialty. Dave, you need an API for some pages? I got you, bro."
I think that's how, in a perfect world, projects like this should go. It shouldn't fall on the shoulders of the same person who is then worried about supporting responsive images and making the CSS Grid layout for the thing.
Dave: It does feel like another job. Honestly, it's like, do I need to hire somebody or pay somebody? But then it's like, "Okay. How much is it going to cost?" I want you to go to this website and scream into it and I'm going to record it and save it. [Laughter] It's called Void Scream.
Dave: You know? Is it really worth $10,000 or whatever I'm going to pay somebody to hang around and let me--whatever--make junk apps? I don't know. That's the thing, but it's starting. It's enough to where it starts to feel like another job because I really don't care what it is. I just want it to work.
Dave: I want it to scale.
Dave: I want it to support 200,000 screams. You know?
Dave: But I just don't know.
Chris: Great idea.
Dave: This is my best app so far. Wouldn't it be cool if you went to a website and it's just whatever post-process screaming howls like [growling]?
Dave: Just 2020 in a nutshell.
Chris: Oh, my god.
Dave: We'll put it on CodePen. We can put it on CodePen.
[Banjo music starts]
Chris: This episode of ShopTalk Show is brought to you by Automattic, super long-term sponsors of ShopTalk Show. High-five. Super appreciate the support.
Automattic makes all kinds of stuff you know. They make WordPress.com. They make the WooCommerce plugin, which is e-commerce for WordPress. They make Jetpack, which brings all kinds of powers to your WordPress site.
You know CSS-Tricks is a WordPress site and I use Jetpack on it for all sorts of things. Here's one that's gotten even a little better lately that I really like: the social media integration. If you're a publisher, like we are, or blog for any reason, and you have a Twitter account for that site, you can have it so that when you write a blog post and hit publish, it's automatically sent to social media networks.
The way that works is you go into your settings in Jetpack and you make the connection, you know, authorize Twitter, Facebook, Tumblr, LinkedIn. That way, when you hit publish on that post, it goes to all the ones that you want it to go to, which is pretty cool, I think.

You have a lot of control there. You have little toggle switches. If you want to not send it to any of them, or want to send it to specific ones in certain cases, you have all that control, as well as the messaging. If you have a tweet that you want to send with this post, you can type it right in there and then, when it tweets, it'll tweet what you've written, or you can do nothing, which is what I often do, although I should probably do better. You'll still get the title of the post, the link, and the featured image as the card.

That's nice, too: when you use the featured image feature of WordPress, which I do, to attach a nice image to each post so that it shows up nice in search results and the meta tags and all that stuff, that becomes part of the card for social media, too, which is nice.
This is an improvement. Now you can hit a little thing and it brings up a modal to show you a preview of what it's going to look like as a Google search result. What is it going to look like as a Facebook card or a Twitter card, and all that? If it doesn't look right, you have the opportunity to change it and make sure that it does before it goes out. That's just good publishing and not terribly much work. It's just a nice little feature of Jetpack, one of many, many things that it does. That's cool.
If you want to work at Automattic, they always have job openings too. I'm looking at one right now for a security research engineer specifically on Jetpack. If you're into figuring out XSS and worrying about social engineering, hijacking, and all that stuff we hear about, go work for them. It's a great remote job. We'll have the details in the show notes.
Thanks for the support.
[Banjo music stops]
Chris: Just to attempt to connect this back to where we came from, I was talking about Jeremy Keith blogging and then I was like, oh, speaking of blogging, you're good at blogging. You started talking about the tangled webs we weave, which meant we started talking about JAMstack and the whole world, like we always do, of all that, and got all the way down to GraphQL schemas. If we circle it all the way back to Jeremy--
Chris: --his blog post was about -- he blogs every day, four times. You know, what a masterpiece. One of them was on Tenet, the new--whatever--super thriller movie, which is very much up my alley -- I want to see it.
Dave: Oh, yeah.
Chris: Because we were talking about looking forward to things, like Melanie Richards and your bike ride on Saturday, I'm looking forward to seeing that movie and I might just do the same damn thing Jeremy did.
Chris: Just pick a low traffic time to go see it and wear my mask and just do it, god dang-it, because I want to see that movie.
Dave: Yeah. Balancing risk and rewards, you know.
Dave: You just go out and mask up. Do it safe. No, that's not a bad plan. Just hit the 10:30 a.m. show for a horror movie. [Laughter]
Chris: Yeah. Although, I did look it up and I don't think ours is doing it yet. Ours is doing evening only, which is tricky. I'm not sure what the lowest thing will be.
Dave: Maybe it's go on a Tuesday instead of a Saturday.
Chris: Ah, there you go. Possibly. Possibly. Possibly. I wish they were doing noon shows. That would be nice.
Dave: My wife is literally kicking me out of the house this weekend. [Laughter] She said, "You have to do something not on your computer."
Dave: I was like, "Uh…."
Dave: I'm out of ideas already, so anyway.
Dave: We'll figure it out. I'll just go to Home Depot and buy tools.
Chris: There you go.
Dave: That'll show her.
Dave: Now I have a planer in my house.
Chris: One of my standards is to just go in the driveway and look around, you know. I usually end up cleaning the car, vacuuming it, and that kind of stuff.
Dave: Oh, that's a trick she puts on you. Yeah, that's good.
Dave: You have to do something. You're like, okay. I'll go -- well, it's interesting because most of the things I think of are tasks, right? Okay, I have free time. I'm going to do tasks, like different things I need to do. But that's not the point. The point is just to go do nothing, stare at nothing, so I've got to figure that out. Maybe I'll go for a hike or something.
Chris: Mm-hmm. Mm-hmm. Even that's kind of something-ish.
Dave: Yeah, a little bit. Yeah.
Chris: Okay, so we've developed -- we should -- it'd be cool to have a list of these at some point. But in the last five-ish episodes, we ended up talking about Web Workers a little bit.
Well, I started thinking about this again today because--and I've talked about this before--I have a WordPress site where I used Gutenberg, because I love Gutenberg, to make these fancy landing pages. But I want to publish them somewhere else, so I have that other site do an Ajax request for that content and it plops it onto that page. But I've decided that would be a very silly thing to do client-side, so I wrote this Edge Worker at the CDN level to do that Ajax request for me. I'm still excited about that idea. I think it's cool. It's working for me.
It's been set it and forget it since I did it. I set it up and it just works all day long. It's great. Which made me think about another philosophy. If you can, if you know that the page you're about to load is doing some client-side Ajax request at some point in its process, maybe the philosophy would be like, "If you can, do it at the worker level."
Chris: If all it does is--I don't know--request some user data or something so it's available, just philosophically ask yourself if it could be done at the worker level. There's no doubt about it. A CDN Ajax request to other servers is going to happen way, way, way, way faster than your client will do it.
Dave: Server-to-server. Yeah, server-to-server because, even with the phone--
Dave: --you're putting a literal radio call from a battery-powered device to a cell phone tower--
Dave: --that then hits the network, then hits a server. But when you do it in a Cloudflare or whatever Edge Worker, it's just like, cool. It took me two milliseconds. I apologize for the wait. You know? [laughter]
Chris: You joke, but it's like that. It's literally like two milliseconds. It's so fast.
Dave: Yeah. I'll take that penalty any day.
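Chris's edge-worker idea -- moving the client-side Ajax request for CMS content up to the CDN, server-to-server -- might look roughly like this Cloudflare Worker sketch. The API URL and the WP-REST-style response shape are assumptions for illustration, not his actual setup:

```javascript
// Hypothetical Cloudflare Worker: fetch WordPress content server-to-server
// at the edge instead of via client-side Ajax. The endpoint URL is made up.
const CONTENT_API = 'https://example.com/wp-json/wp/v2/pages/42';

// Pull the rendered HTML out of a WP-REST-style response object.
function extractRendered(post) {
  return post.content.rendered;
}

async function handleRequest() {
  const upstream = await fetch(CONTENT_API); // server-to-server: no radio hop
  const post = await upstream.json();
  return new Response(extractRendered(post), {
    headers: { 'content-type': 'text/html;charset=UTF-8' },
  });
}

// In the Workers runtime, this wires the handler up to incoming requests.
if (typeof addEventListener === 'function') {
  addEventListener('fetch', (event) => event.respondWith(handleRequest()));
}
```

The edge node does the round-trip to the origin over data-center links, which is why the "two milliseconds" joke lands: the phone only ever makes one request.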
Dave: I hadn't really thought about an Ajax request, like, kind of whatever.
Chris: Anything that's Ajax.
Dave: I mean would you do that for, like, auth?
Dave: Like pre-auth a page, kind of sort of thing?
Dave: Hmm. Interesting.
Chris: It depends because you don't have much data at the URL level. You have what's in the URL, so if you can get something in the URL, then you can use it. You know?
Chris: But you also have just whatever your server -- hmm. You know. I don't know. You've got to think about it. But they also do have storage at the edge too. Cloudflare does, specifically, and they're just, in my opinion, a little ahead of the game and other companies are going to start doing this because it's just an obviously good idea. When you have storage there out at the edge, too, that can be kind of cool. For example, if you wanted to bet the farm on this, you could put your auth there.
Chris: The storage at the edge is where you know about users and information about users.
Dave: Yeah. Interesting.
Chris: So, your auth could happen there, potentially.
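The "auth at the edge" idea -- session data living in edge key-value storage like Cloudflare Workers KV -- could be sketched like this. A plain Map stands in for the KV store here; a real KV read is async, and the key naming is an invented convention:

```javascript
// Sketch of an edge auth check against key-value storage. A Map stands in
// for something like Workers KV (whose get() actually returns a Promise).
function isAuthorized(store, sessionId) {
  if (!sessionId) return false;
  // The "session:" key prefix is a made-up convention for this sketch.
  return store.has(`session:${sessionId}`);
}

const edgeStore = new Map([
  ['session:abc123', JSON.stringify({ user: 'dave' })],
]);

isAuthorized(edgeStore, 'abc123'); // true -- serve the page from the edge
isAuthorized(edgeStore, 'nope');   // false -- redirect to login, no origin hit
```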
Dave: You know what's cool? Functions. Are functions a part of this whole -- or you're specifically in workers?
Chris: Yeah, but they … Node, you know?
Dave: Right. Yeah, well, that's what I like about functions. I'm thinking about this GraphQL stuff. It's like a lot of the examples online are like, you know, user sign-in and that's OAuth and that's their request and all that. You know? Which is good. That's a good security practice. Users have to auth in.
But I was like, you know, for some of this it could just be if I just had a key, but you don't want to expose your keys, as a developer. But then I was like, oh, functions, cloud functions, or Netlify functions.
Dave: I could just -- I have the key there, so I just have to fetch my function.
Dave: Actually, all my GraphQL now is in a function. Anyway, it was just very interesting to me. There are a lot of different workarounds you could do. There are a lot of places your little authentication or how workers are kind of working. It's just interesting.
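Dave's move -- keeping the GraphQL key inside a serverless function so it never ships to the browser -- might look like this hypothetical Netlify-style function. The endpoint, env var name, and handler shape are assumptions, not his actual code:

```javascript
// Hypothetical serverless function that proxies GraphQL queries so the
// API key stays server-side (read from an env var, never sent to clients).
function buildGraphQLRequest(query, key) {
  return {
    method: 'POST',
    headers: {
      'content-type': 'application/json',
      authorization: `Bearer ${key}`, // the secret lives only on the server
    },
    body: JSON.stringify({ query }),
  };
}

// Netlify-style handler shape (an assumption; adapt for your platform).
async function handler(event) {
  const req = buildGraphQLRequest(event.body, process.env.GRAPHQL_API_KEY);
  const upstream = await fetch('https://example.com/graphql', req);
  return { statusCode: upstream.status, body: await upstream.text() };
}
```

The browser then fetches the function instead of the GraphQL endpoint directly, which is the "I have the key there, so I just have to fetch my function" pattern.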
Chris: That's why this is a paradigm shift: it's going to start to be assumed a little bit. It'll be like, I'm not even doing it this way because I'm opting into it. I'm doing it because I have to, because of how the rest of my stack works -- I need to run something at a server-ish level or a CDN-ish level and these are my only options, so that's what I'm going to do. But that opens up your mind.
What's cool is, like I said, they just work all day. You're not thinking about the server that they're running on. It's ridiculously cheap. It's ridiculously fast. It's ridiculously secure.
Chris: That's why it's a paradigm shift or whatever, not to use a terrible word like that.
Dave: Well, so you're kind of coming up with rules for when you'd use what kind of worker, sort of?
Chris: Kind of because it's always better.
Dave: That's fair.
Chris: You know?
Dave: You know like workers over--whatever--main thread, you know. Any time, almost any time, you can use a worker. It's better, right?
Chris: Yeah, but I'm sure there's nuance. Like, literacy was a word you brought up earlier that I liked. Does everybody know how this stuff works? Not everybody. There's learning-curve stuff here.
Dave: No, well, and the prerequisite learning block is async code, right? That's hard stuff. It catches me every day. It's just like, why is this resolving as nothing, but then the console logged something out? I don't understand.
Chris: Mm-hmm. Mm-hmm.
Dave: You're just like, I don't get that. But it's because of latency or async programming. Yeah.
Chris: Yeah, that's no fun. Yeah, I was going to connect this to -- it's like the best tool for the job, kind of, you know, and I just like watching how the industry evolved.
You know we had -- yeah. [Laughter]
Dave: Gerry McGovern?
I was kind of trying to have a think about that. I think it's more complicated than that. I don't know that you could just say that because, like everything in the world, it depends.
That generally maps to performance. It generally maps to accessibility. And it generally maps to power consumption as well, which is fascinating.
Dave: Oh, that's not only rule of least power but actual power. [Laughter] Actual compute time. That's sort of what it comes down to. I think there was that kind of Great Gatsby -- ooh, I didn't mean that. The Great Gatsby debacle of the summer where it was like, oh, these build times take 30 minutes. My eyes just went so wide because 30 minutes of computer time is infinity for how many flops these things florp.
Dave: That's so many computes, you know. I understand what it is. It's like image compression and cutting different images. Again, not technology shaming so much, but it was just interesting to me that sometimes we think, "Oh, computer, go." But there's a real-world ramification to some of it.
Chris: Mm-hmm. That really is taking a lot of electricity too, for sure.
Then how much power really is that? Well, how do you factor in your monitor? What if one of those animations was on white and one of them was on black, dark pixels, just by virtue of what you painted? Well, bright pixels take a lot more power to paint, so you've got to factor that in too.
Chris: I started thinking about, because of the Edge Worker stuff, it's actually far more energy-efficient to send a file less distance. A CDN is more efficient in that way but, also, every time you push, your image is propagated to 100 servers all over the world and they're stored on 100 servers around the world. That costs electricity too, so you're incurring costs there before you save it.
Dave: Right, but that -- yeah. No, true. But that's a one-time cost, maybe. Probably not.
Dave: Versus a million times every second cost.
Chris: Exactly. That's why there are carbon calculator websites where you type your daverupert.com into it and it tells you. It tries to factor in traffic, too.
Chris: If your only goal in the world is using the least electricity on your website, well, then, delete your website. [Laughter]
Chris: The best website is nothing. The more traffic you get, the worse.
Dave: No. That's a problem, too, right? The reason you got to BuzzFeed levels was because you were putting 24 meg GIFs on it. You know? So, what do we do?
[Banjo music starts]
Chris: This episode of ShopTalk Show is brought to you in part by Framer. Literally, framer.com/shoptalk, go there to sign up for free or to get 20% off if you go for any of their paid plans, which is super cool. Thanks, Framer.
Good design work, it should clearly communicate a message. The same is true for you, a designer. You should be clearly communicating a message. To whom? To the people you need to sign off on your design: your manager, your client, your CEO.
Why are you still presenting flat, lifeless product ideas? Put an interactive prototype into the hands of those people and watch their eyes light up. They'll buy into your vision. It's much more convincing to present an interactive prototype. Framer is your secret weapon.
Start from scratch or import work from another design tool. Drag and drop. Build these powerful, interactive components. Set up transitions and animations, which are almost expected these days -- when your app moves from one page to the next, or from one state to another, you expect transitions to get you there. Create your own stunning animations, all visually. It's a Web app, you know, beautiful for it.
Rich, realistic prototyping just became intuitive in Framer. Just take it from Bruno, a product designer at Shutterstock. "Framer is an extraordinary resource for rapid prototyping. It has improved our team collaboration by providing an easy way to share the designs between engineers and project managers. Its built-in tools have been essential for quick prototyping and user testing."
Oh, how awesome is that? Again, that's framer.com/shoptalk where you can sign up for free or get 20% off any of their paid plans.
[Banjo music stops]
Chris: Aye yie yie. Okay, let's see. Let's just do one really quick here from Rory O'Connor, who wrote in. Thanks. People, write in questions. We love it.
Rory writes in, "How do you make peace with production code that is written by your naive, earlier self? Usually, there's no urgent reason to rewrite it, and doing so may introduce bugs, but the vain part of me doesn't like that there is this newb-looking code out there."
Dave: Hmm. Well, I think I take the "long hair don't care" strategy.
Dave: But I understand this because you want work to be work you're proud of. I think you could always write a to-do comment. I know that's dumb. You could just say, this is probably not as efficient as it could be or something. Just kind of own up to it so when future you or people ten years from now who aren't related to you or don't know you at all, they pick up that code and maybe they can say, "Well, at least that person tried." [Laughter] You know?
Chris: Yeah. Yeah.
Dave: The hard thing is no one pays -- your boss or your whatever -- no one pays to fix up old code, working code. No one really wants to do that unless you hit a bottleneck of some kind.
Chris: Yeah, it's not an actual problem. You know? I think I heard David Khourshid say this once. He's like, "I don't call it legacy code. I call it legendary code."
Chris: It's just kind of a fun way to kind of pay homage or something, you know, like bow down to that code. Even though you think it sucks now, guess what it's been doing for a long time. Working.
Dave: Yeah. Yeah. Well, and again, it's not a problem. It's working. It's good code. It's up. Any code could be better.
Dave: But then it's this Adonis Factor almost, right? You have this perfect idealized hot man code, you know?
Dave: That's like your dream state code body and, guess what, most people don't achieve that or whatever. It's hard to do, so I think you just have to almost use that as the -- you know, get a notebook. I've been doing dev journaling. I could talk about that for another half hour.
Dave: Open up a dev journal. Seriously do this. Just write down the problems you see in it. Make a blog post. That's a great blog post. I wrote this code five years ago. Here's what I don't like about it. Just blog it. That'll trigger you so you don't make that mistake next time around.
Dave: Maybe you can help somebody else from making that mistake. Maybe that's the best way to extract value out of this old code. It's just like, here's something I did. Here's what I don't like about it.
Chris: I like it. Cool. I do want to hear more about that at some point. It always impresses me that people are able to write and not publish it. I have a very bad trigger finger with the writing and hitting go.
Dave: Yeah. I mean more of it is like day-to-day.
Chris: It's too boring for a blog post.
Dave: Too boring. Too personal. Too, I was in a meeting and I cried. You know? There's too much of that stuff. No one wants to read that.
Chris: All right, man. We are going to pass this off now to our friend Jeremy Keith who is reading aloud, dictating some of the posts from Jay Hoffmann's Web history series on my blog CSS-Tricks. Now, Jay publishes a newsletter about Web history and has a site about it with this beautiful timeline. He has really gone deep, deep, deep, deep, deep into learning about and chronicling and saving Web history, like what happened in the early days of the Web. Who was involved?
There are lots of angles to it. It's almost too big of a topic to talk about just one way, like, well, there was Tim Berners-Lee and then he made the Web. That's how he's going to start, so buckle your seatbelt for some Tim Berners-Lee, but stay tuned for future episodes where things -- there are different players involved, different things happening. It's very interesting stuff, so thanks to Jay for writing this series and thanks to Jeremy for dictating it for us.
[Banjo music plays]
Jeremy Keith: So, Tim Berners-Lee is fascinated with information. It has been his life's work. For over four decades, he has sought to understand how it is mapped and stored and transmitted, how it passes from person-to-person, how the seeds of information become the roots of dramatic change.
It is so fundamental to the work that he has done that when he wrote the proposal for what would eventually become the World Wide Web, he called it "Information Management: A Proposal." Information is the Web's core function. A series of bytes stream across the world and at the end of it is knowledge.
The mechanism for this transfer, what we know as the Web, was created by the intersection of two things. The first is the Internet, the technology that makes it all possible. The second is hypertext, the concept that grounds its use. They were brought together by Sir Tim Berners-Lee. When he was done, he did something truly spectacular. He gave it away to everyone to use for free.
When Tim Berners-Lee submitted "Information Management: A Proposal" to his superiors, they returned it with a comment on the top that read simply, "Vague, but exciting."
The Web wasn't a sure thing. Without the hindsight of today, it looked far too simple to be effective. In other words, it was a hard sell. Berners-Lee was proficient at many things but he was never a great salesman. He loved his idea for the Web, but he had to convince everybody else to love it too.
Sir Tim Berners-Lee has a mind that races. He has been known, based on interviews and public appearances, to jump from one idea to the next. He is almost always several steps ahead of what he is saying, which is often quite profound. Until recently, he only gave a rare interview here and there and masked his greatest achievements with humility and a wry British wit.
What is immediately apparent is that Sir Tim Berners-Lee is curious, curious about everything. It has led him to explore some truly revolutionary ideas before they became truly revolutionary. But it also means that his focus is typically split. It makes it hard for him to hold onto things in his memory. "I'm certainly terrible at names and faces," he once said in an interview.
His original fascination with the elements for the Web came from a very personal need to organize his own thoughts and connect them together, disparate and unconnected as they are. It is not at all unusual that when he reached for a metaphor for that organization, he came up with "a web."
As a young boy, his curiosity was encouraged. His parents, Conway Berners-Lee and Mary Lee Woods, were mathematicians. They worked on the Ferranti Mark 1, the world's first commercially available computer, in the 1950s.
They fondly speak of Berners-Lee as a child taking things apart, experimenting with amateur engineering projects. There was nothing that he didn't seek to understand further. Electronics and computers specifically were particularly enchanting.
Berners-Lee sometimes tells the story of a conversation he had with his father as a young boy about the limitations of computers making associations between information that was not intrinsically linked. "The idea stayed with me that computers could be much more powerful," Berners-Lee recalls. "If they could be programmed to link otherwise unconnected information, in an extreme view the world can be seen as only connections."
He didn't know it yet, but Berners-Lee had stumbled upon the idea of hypertext at a very early age. It would be several years before he would come back to it.
History is filled with attempts to organize knowledge. An often-cited example is the Library of Alexandria, a fabled library of Ancient Greece that was thought to have had tens of thousands of meticulously organized texts. At the turn of the century, Paul Otlet tried something similar in Belgium. The project was called the Répertoire Bibliographique Universel, the Universal Bibliography.
Otlet and a team of researchers created a library of over 15 million index cards, each with a discrete and small piece of information in topics ranging from science to geography. Otlet devised a sophisticated numbering system that allowed him to link one index card to another. He fielded requests from researchers around the world via mail or telegram and Otlet's researchers could follow a trail of linked index cards to find an answer. Once properly linked, information becomes infinitely more useful.
A sudden surge of scientific research in the wake of WWII prompted Vannevar Bush to propose another idea. In his groundbreaking essay in The Atlantic in 1945 entitled "As We May Think," Bush imagined a mechanical library called a Memex. Like Otlet's universal bibliography, the Memex stored bits of information. But instead of index cards, everything was stored on compact microfilm.
Through the process of what he called associative indexing, users of the Memex could follow trails of related information through an intricate Web of links.
The list of attempts goes on, but it was Ted Nelson who finally gave the concept a name in 1968, two decades after Bush's article in The Atlantic. He called it hypertext. Hypertext is essentially linked text.
Nelson observed that, in the real world, we often give meaning to the connections between concepts. It helps us grasp their importance and remember them for later. The proximity of a Post-It to your computer, the orientation of ingredients in your refrigerator, the order of books on your bookshelf. Invisible though they may seem, each of these signifiers hold meaning, whether consciously or subconsciously, and they are only fully realized when taking a step back.
Hypertext was a way to bring those same kinds of meaningful connections to the digital world. Nelson's primary contribution to hypertext is a number of influential theories and a decades-long project still in progress known as Xanadu. Much like the Web, Xanadu uses the power of a network to create a global system of links and pages. However, Xanadu puts a far greater emphasis on the ability to trace text to its original author for monetization and attribution purposes. This distinction known as transclusion has been a near-impossible technological problem to solve.
Nelson's interest in hypertext stems from the same issue with memory and recall as Berners-Lee's. He refers to it as his hummingbird mind. Nelson finds it hard to hold onto associations he creates in the real world. Hypertext offers a way for him to map associations digitally so that he can call on them later.
Berners-Lee and Nelson met for the first time a couple of years after the Web was invented. They exchanged ideas and philosophies, and Berners-Lee was able to thank Nelson for his influential thinking. At the end of the meeting, Berners-Lee asked if he could take a picture. Nelson in turn asked for a short video recording. Each was commemorating the moment they knew they would eventually forget and each turned to technology for a solution.
By the mid-'80s, on the wave of innovation in personal computing, there were several hypertext applications out in the wild. The hypertext community, a dedicated group of software engineers that believed in the promise of hypertext, created programs for researchers, academics, and even off-the-shelf personal computers. Every research lab worth their salt had a hypertext project. Together, they built entirely new paradigms into their software, processes, and concepts that feel wonderfully familiar today but were completely outside the realm of possibilities just a few years earlier.
At Brown University, the very place where Ted Nelson was studying when he coined the term hypertext, Norman Meyrowitz, Nancy Garrett, and Karen Catlin were the first to breathe life into the hyperlink, which was introduced in their program Intermedia. At Symbolics, Janet Walker was toying with the idea of saving links for later, a kind of speed dial for the digital world, something she was calling a bookmark.
At the University of Maryland, Ben Shneiderman sought to compile and link the world's largest source of information with his interactive encyclopedia system. Dame Wendy Hall, at the University of Southampton, sought to extend the life of the link further in her own program, Microcosm. Each link made by the user was stored in a link base, a database apart from the main text specifically designed to store metadata about the connection.
In Microcosm, links could never die, never rot away. If their connection was severed, they could point elsewhere since links weren't directly tied to text. You could even write a bit of text alongside links, expanding a bit on why the link was important, or add to a document separate layers of links. One, for instance, a tailored set of carefully curated references for experts on a given topic, the other a more laidback set of links for the casual audience.
There were mailing lists and conferences and an entire community that was small, friendly, fiercely competitive, and locked in an arms race to find the next big thing. It was impossible not to get swept up in the fervor.
Hypertext enabled a new way to store actual tangible knowledge. With every innovation, the digital world became more intricate and expansive and all-encompassing.
Then came the heavy hitters. Under a shroud of mystery, researchers and programmers at the legendary Xerox PARC were building NoteCards. Apple caught wind of the idea and found it so compelling that they shipped their own hypertext application called HyperCard, bundled right into the Mac operating system.
If you were a late Apple II user, you likely have fond memories of HyperCard, an interface that allowed you to create a card and quickly link it to another. Cards could be anything: a recipe, maybe, or the prototype of the latest project. One by one, you could link those cards up visually and with no friction until you had a digital reflection of your ideas.
Towards the end of the '80s, it was clear that hypertext had a bright future. In just a few short years, the software had advanced in leaps and bounds.
After a brief stint studying physics at The Queen's College, Oxford, Sir Tim Berners-Lee returned to his first love, computers. He eventually found a short-term, six-month contract at the particle physics lab Conseil Européen pour la Recherche Nucléaire -- the European Council for Nuclear Research, or simply CERN.
CERN is responsible for a long line of particle physics breakthroughs. Most recently, they built the Large Hadron Collider, which led to the confirmation of the Higgs boson particle, a.k.a. the God particle.
CERN doesn't operate like most research labs. Its internal staff makes up only a small percentage of the people that use the lab. Any research team from around the world can come and use the CERN facilities provided that they are able to prove their research fits within the stated goals of the institution. A majority of CERN occupants are from these research teams.
CERN is a dynamic, sprawling campus of researchers ferrying from location-to-location on bicycles or minecarts working on the secrets of the universe. Each team is expected to bring their own equipment and expertise. That includes computers.
Berners-Lee was hired to assist with software on an earlier version of the particle accelerator called the Proton Synchrotron. When he arrived, he was blown away by the amount of pure, unfiltered information that flowed through CERN. It was nearly impossible to keep track of it all and equally impossible to find what you were looking for.
Berners-Lee wanted to capture that information and organize it. His mind flashed back to that conversation with his father all those years ago. What if it were possible to create a computer program that allowed you to make random associations between bits of information? What if you could, in other words, link one thing to another?
He began working on a software project on the side for himself. Years later, that would be the same way he built the Web. He called this project Enquire, named for a Victorian handbook he had read as a child. Using a simple prompt, Enquire users could create a block of info, something like Otlet's index cards all those years ago. Just like the universal bibliography, Enquire allowed you to link one block to another. Tools were bundled in to make it easier to zoom back and see the connections between the links. For Berners-Lee, this filled a simple need. It replaced the part of his memory that made it impossible for him to remember names and faces with a digital tool.
Compared to the software being actively developed at the University of Southampton or at Xerox or Apple, Enquire was unsophisticated. It lacked a visual interface and its format was rudimentary. A program like HyperCard supported rich media and advanced two-way connections, but Enquire was only Berners-Lee's first experiment with hypertext. He would drop the project when his contract was up at CERN.
Berners-Lee would go and work for himself for several years before returning to CERN. By the time he came back, there would be something much more interesting for him to experiment with. Just around the corner was the Internet.
Packet switching is the single most important invention in the history of the internet. It is how messages are transmitted over a globally decentralized network. It was discovered almost simultaneously in the late '60s by two different computer scientists, Donald Davies and Paul Baran. Both were interested in the way in which it made networks resilient.
Traditional telecommunications at the time were managed by what is known as circuit switching. With circuit switching, a direct connection is opened between the sender and the receiver. The message is sent in its entirety between the two. That connection needs to be persistent and each channel can only carry a single message at a time. That line stays open for the duration of a message and everything is run through a centralized switch.
If you're searching for an example of circuit switching, you don't have to look far. That's how telephones work or used to at least. If you've ever seen an old film or even a TV show like Mad Men where an operator pulls a plug out of a wall and plugs it back in to connect a telephone call, that's circuit switching, though that was all eventually automated. Circuit switching works because everything is sent over the wire all at once and through a centralized switch. That's what the operators are connecting.
Packet switching works differently. Messages are divided into smaller bits or packets and sent over the wire a little at a time. They can be sent in any order because each packet has just enough information to know where in the order it belongs. Packets are sent through until the message is complete and then reassembled on the other side.
There are a few advantages to a packet switch network. Multiple messages can be sent at the same time over the same connection, split up into little packets. Crucially, the network doesn't need centralization. Each node in the network can pass around packets to any other node without a central routing system. This made it ideal in a situation that requires extreme adaptability like in the fallout of an atomic war, Paul Baran's original reason for devising the concept.
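The mechanics described here -- numbered packets, out-of-order arrival, reassembly at the far end -- can be sketched in a few lines of Python. This is only a toy illustration of the concept, not any real network protocol; the packet size and the plain sequence numbers are assumptions made for the demo.

```python
import random

def packetize(message: bytes, size: int = 4):
    # Break a message into small chunks, tagging each with a
    # sequence number so it knows where it belongs in the whole.
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    # Packets may arrive in any order; sorting by sequence number
    # restores the original message on the receiving end.
    return b"".join(chunk for _, chunk in sorted(packets))

message = b"Vague, but exciting."
packets = packetize(message)
random.shuffle(packets)  # simulate packets taking different routes
assert reassemble(packets) == message
```

Because each packet carries its own position, no central switch has to hold a line open: any node can forward any packet, and the receiver can still put the message back together.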
When Davies began shopping around his idea for packet switching to the telecommunications industry, he was shown the door. "I went along to Siemens once and talked to them and they actually used the words -- they accused me of technical -- they were really saying that I was being impertinent by suggesting anything like packet switching. I can't remember the exact words, but it amounted to that. That I was challenging the whole of their authority."
Traditional telephone companies were not at all interested in packet switching but ARPA was. ARPA, later known as DARPA, was a research agency embedded in the United States Department of Defense. It was created in the throes of the Cold War, a reaction to the launch of the Sputnik satellite by Russia, but without a core focus. It was created at the same time as NASA, so launching things into space was already taken.
To adapt to their situation, ARPA recruited research teams from colleges around the country. They acted as a coordinator and mediator between several active university projects with a military focus. ARPA's organization had one surprising and crucial side effect. It was comprised mostly of professors and graduate students who were working at its partner universities. The general attitude was that as long as you could prove some sort of modest relation to military application, you could pitch your project for funding. As a result, ARPA was filled with lots of ambitious and free-thinking individuals working outside of a buttoned-up government agency, with little oversight, coming up with the craziest and most world-changing ideas they could.
"We expected that a professional crew would show up eventually to take over the problems we were dealing with," recalls Bob Kahn, an ARPA programmer critical to the invention of the Internet. "The professionals never showed up."
One of those professors was Leonard Kleinrock at UCLA. He was involved in the first stages of ARPANET, the network that would eventually become the Internet. His job was to help implement the most controversial part of the project, the still theoretical concept known as packet switching, which enabled a decentralized and efficient design for the ARPANET network. It is likely that the Internet would not have taken shape without it.
Once packet switching was implemented, everything came together quickly. By the early 1980s, it was simply called "the Internet." By the end of the 1980s, the Internet went commercial and global, including a node at CERN.
The first applications of the Internet are still in use today. FTP used for transferring files over the network was one of the first things built. Email is another one. It had been around for a couple of decades on a closed system already. When the Internet began to spread, email became networked and infinitely more useful.
Other projects were aimed at making the Internet more accessible. They had names like Archie, Gopher, and WAIS and have largely been forgotten. They were united by a common goal of bringing some order to the chaos of a decentralized system. WAIS and Archie did so by indexing the documents put on the Internet to make them searchable and findable by users. Gopher did so with a structured hierarchical system.
Kleinrock was there when the first message was ever sent over the Internet. He was supervising that part of the project and, even then, he knew what a revolutionary moment it was. However, he was quick to note that not everybody shared that feeling in the beginning.
He recalls the sentiment held by the titans of the telecommunications industry, like the Bell Telephone Company: "They said, 'Little boy, go away,' and so we went away."
Most felt that the project would go nowhere, nothing more than a technological fad. In other words, no one was paying much attention to what was going on and no one saw the Internet as much of a threat.
When that group of professors and graduate students tried to convince their higher-ups to let the whole thing be free, to let anyone implement the protocols of the Internet without a need for licenses or license fees, they didn't get much pushback. The Internet slipped into public use and only the true technocratic dreamers of the late 20th Century could have predicted what would happen next.
Berners-Lee returned to CERN in a fellowship position in 1984. It was four years after he had left. A lot had changed. CERN had developed their own network known as CERN Net. But by 1989, they arrived and hooked up to the new internationally standard Internet.
"In 1989, I thought," he recalls, "look, it will be so much easier if everybody asking me questions all the time could just read my database and it'll be so much nicer if I could find out what these guys are doing just by jumping into a similar database of information for them." Put another way, he wanted to share his own homepage and get a link to everyone else's.
What he needed was a way for researchers to share these databases without having to think much about how it all works. His way in, with management, was operating systems. CERN's research teams all bring their own equipment, including computers, and there's no way to guarantee they're all running the same OS. Interoperability between operating systems is a difficult problem by design, generally speaking. The goal of an OS is to lock you in.
Among its many other uses, the globally networked hypertext system like the Web was a wonderful way for researchers to share notes between computers using different operating systems. However, Berners-Lee had a bit of trouble explaining his idea. He's never exactly been concise.
By 1989, when he wrote "Information Management: A Proposal," Berners-Lee already had worldwide ambitions. The document has thousands of words filled with diagrams and charts. It jumps energetically from one idea to the next without fully explaining what's just been said. Much of what would eventually become the Web was included in the document but it was just too big of an idea. It was met with a lukewarm response, that "Vague, but exciting" comment scrawled across the top.
A year later, in May of 1990, at the encouragement of his boss, Mike Sendall, the author of that comment, Berners-Lee circulated the proposal again. This time, it was enough to buy him a bit of time internally to work on it. He got lucky. Sendall understood his ambition and aptitude. He wouldn't always get that kind of chance. The Web needed to be marketed internally as an invaluable tool. CERN needed to need it.
Taking complex ideas and boiling them down to their most salient marketable points, however, was not Berners-Lee's strength. For that, he was going to need a partner. He found one in Robert Cailliau.
Cailliau was a CERN veteran. By 1989, he had worked there as a programmer for over 15 years. He'd embedded himself in the company culture, proving a useful resource helping teams organize their informational toolset and knowledge-sharing systems. He had helped several teams at CERN do exactly the kind of thing Berners-Lee was proposing, though at a smaller scale.
Temperamentally, Cailliau was about as different from Berners-Lee as you could get. He was hyper-organized and fastidious. He knew how to sell things internally and he'd made plenty of political inroads at CERN.
What he shared with Berners-Lee was an almost insatiable curiosity. During his time as a nurse in the Belgian military, he got fidgety. "When there was slack at work, rather than sit in the infirmary twiddling my thumbs, I went and got myself some time on the computer there." He ended up as a programmer in the military working on wargames and computerized models. He couldn't help but look for the next big thing.
In the late '80s, Cailliau had a strong interest in hypertext. He was taking a look at Apple's Hypercard as a potential internal documentation system at CERN when he caught wind of Berners-Lee's proposal. He immediately recognized its potential.
Working alongside Berners-Lee, Cailliau pieced together a new proposal, something more concise, more understandable, and more marketable. While Berners-Lee began putting together the technologies that would ultimately become the Web, Cailliau began trying to sell the idea to interested parties inside of CERN. The Web, in all of its modern uses and ubiquity, can be difficult to define as just one thing. We have the Web in our refrigerators now.
In the beginning, however, the Web was made up of only a few essential features. There was the Web server, a computer wired to the Internet that can transmit documents and media, webpages, to other computers. Webpages are served via HTTP, a protocol designed by Berners-Lee in the earliest iterations of the Web.
HTTP is a layer on top of the Internet and was designed to make things as simple and resilient as possible. HTTP is so simple that it forgets a request as soon as it has made it. It has no memory of the webpages it served in the past. The only thing HTTP is concerned with is the request it's currently making. That makes it magnificently easy to use.
These webpages are sent to browsers, the software that you're using to read this article. Browsers can read documents handed to them by servers because they understand HTML, another early invention of Sir Tim Berners-Lee. HTML is a markup language. It allows programmers to give meaning to their documents so that they can be understood.
The H in HTML stands for hypertext. Like HTTP, HTML -- all of the building blocks programmers can use to structure a document -- wasn't all that complex, especially when compared to other hypertext applications at the time. HTML comes from a long line of other similar markup languages, but Berners-Lee expanded it to include the link in the form of an anchor tag. The A tag is the most important piece of HTML because it serves the Web's greatest function: to link together information.
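The point that the link lives entirely in the anchor tag is easy to see in code. Here is a minimal sketch using Python's standard-library HTML parser -- the page snippet and the `LinkExtractor` name are made up for the demo -- that pulls out every `href` it finds:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every anchor (<a>) tag in a document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag.
        if tag == "a":
            self.links.extend(value for name, value in attrs if name == "href")

page = '<p>See <a href="http://info.cern.ch/">the first website</a>.</p>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # prints ['http://info.cern.ch/']
```

There is no link base and no metadata attached: sever that one `href` and the connection is simply gone, which is exactly the trade-off the hypertext community objected to.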
The hyperlink was made possible by the Universal Resource Identifier, URI, later renamed the Uniform Resource Identifier after the IETF found the word "universal" to be a bit too substantial. But for Berners-Lee, that was exactly the point. "Its universality is essential: the fact that a hyperlink can point to anything, be it personal, local or global, be it draft or highly polished," he wrote in his personal history of the Web.
Of all the original technologies that made up the Web, Berners-Lee and several others have noted that the URL was the most important. By Christmas of 1990, Sir Tim Berners-Lee had all of that built. A full prototype of the Web was ready to go.
Cailliau, meanwhile, had had a bit of success trying to sell the idea to his bosses. He hoped that his revised proposal would give him a team and some time. Instead, he got six months and a single staff member, intern Nicola Pellow.
Pellow was new to CERN, on placement for her mathematics degree. But her work on the Line Mode Browser, which enabled people from around the world using any operating system to browse the Web, proved a crucial element in the Web's early success. Berners-Lee's work, combined with the Line Mode Browser, became the Web's first set of tools. It was ready to show to the world.
When the team at CERN submitted a paper on the World Wide Web to the San Antonio Hypertext Conference in 1991, it was soundly rejected. They went anyway and set up a table with a computer to demo it to conference attendees.
One attendee remarked, "They have chutzpah calling that the World Wide Web!"
The knock against the Web was that it was not at all sophisticated. Its use of hypertext was elementary, allowing for only simplistic text-based links. And without two-way links, pretty much a given in hypertext applications, links could go dead at any minute. There was no link base or sophisticated metadata assigned to links. There was just the anchor tag.
The protocols that ran on top of the Internet were similarly basic. HTTP only allowed for a handful of actions, and alternatives like Gopher or WAIS offered far more options for more advanced connections through the Internet network.
It was hard to explain, difficult to demo, and had overly lofty ambition. It was created by a man who didn't have much interest in marketing his ideas. Even the name was somewhat absurd. WWW is one of only a handful of acronyms that actually takes longer to say than the full "World Wide Web."
We know how this story ends. The Web won. It's used by billions of people and runs through everything we do. It is among the most remarkable technological achievements of the 20th Century.
It had a few advantages, of course. It was instantly global and widely accessible thanks to the Internet. And the URL, and its uniqueness, is one of the more clever concepts to have come from networked computing.
But if you want to truly understand why the Web succeeded, we have to come back to information. One of Berners-Lee's deepest held beliefs is that information is incredibly powerful, and that it deserves to be free. He believed that the Web could deliver on that promise. For it to do that, the Web would need to spread.
Berners-Lee looked to his predecessors for inspiration: the Internet. The Internet succeeded, in part, because its creators gave it away to everyone. After considering several licensing options, he lobbied CERN to release the Web unlicensed to the general public. CERN, an organization far more interested in particle physics breakthroughs than hypertext, agreed.
In 1993, the Web officially entered the public domain and that was the turning point. They didn't know it then, but that was the moment the Web succeeded, when Berners-Lee was able to make globally available information truly free.
In an interview some years ago, Berners-Lee recalled how it was that the Web came to be. "I had the idea for it. I defined how it would work. But it was actually created by people."
This may sound like humility from one of the world's great thinkers, and it is that a little, but it is also the truth. The Web was Berners-Lee's gift to the world. He gave it to us, and we made it what it was. He and his team fought hard at CERN to make that happen.
Berners-Lee knew that with the resources available to him, he would never be able to spread the Web sufficiently outside of the hallways of CERN. Instead, he packaged up all the code that was needed to build a browser into a library called libwww and posted it to a Usenet group. That was enough for some people to get interested in browsers. But before browsers would be useful, you needed something to browse.
[Banjo music plays]
Dave: All right. Thanks again. That was wonderful and it has me thinking a lot about my very first experience on the Web, like when I first hit the Web. You know? That was like my big -- I don't know. Go write blog posts about your Web experience. That would be good. Anyway, cool. Chris, do you have anything else you want to say?