Surma said Deno 28 times and Deno 5 times.
For a second there I thought you were being helpful.
I read that as Dee-no and Den-o and was like "Oh ya, that sounds about right"
This is amazing. People are staying away from Deno right now because of stability and because the industry doesn't use it. I love it though!
YES! i've been waiting for some legit javascript influencers to start promoting deno
Maybe Ryan Dahl could endorse it
Loving Deno so far, to my surprise contributing to projects in Rust has been easier to me as well! Great episode.
Deno is just lovely 🥰
The URL imports and binary compilation sound like a viable alternative to what Golang offers; having build tooling included makes it so much more approachable for beginners.
The original implementation was based on Golang instead of Rust, and, at least in my opinion, Ryan took a lot of inspiration from Golang.
Damn, the CLI support for alert, confirm, and prompt is so nice for interactive CLIs. Loved it.
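For anyone curious, here's a rough sketch of what that looks like (the filename and strings are made up for illustration):

```ts
// greet.ts - a tiny interactive CLI using Deno's web-style dialog globals.
// In a terminal, prompt() and confirm() read from stdin, and alert() waits for Enter.
// Run with: deno run greet.ts (no permission flags needed for stdin/stdout)
const name = prompt("What's your name?") ?? "stranger";

if (confirm(`Greet ${name}?`)) {
  alert(`Hello, ${name}!`);
} else {
  console.log("Fine, no greeting then.");
}
```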
It's so refreshing to see something that looks like it was written from a clean slate without the burden of all the previous mistakes.
~10 years later when DeDeno comes out, "it's so refreshing to see something that looks like it was written from a clean slate without the burden of all the previous mistakes"
My fav show's backkk! 🥰🥰 Thanks both!
I'm already using Deno to write the things I would otherwise write in bash.
The cold start is not as fast, but TypeScript is 100x more ergonomic than bash.
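For what it's worth, a sketch of the kind of script I mean (the filename is made up and the directory handling is simplified); the permission flag is the only extra ceremony compared to bash:

```ts
// count-lines.ts - roughly what I'd otherwise do with find + wc in bash.
// Run with: deno run --allow-read count-lines.ts ./src
const dir = Deno.args[0] ?? ".";
let total = 0;

for await (const entry of Deno.readDir(dir)) {
  if (entry.isFile && entry.name.endsWith(".ts")) {
    const text = await Deno.readTextFile(`${dir}/${entry.name}`);
    total += text.split("\n").length;
  }
}

console.log(`${total} lines of TypeScript in ${dir}`);
```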
Would be interesting to see how they will manage the typescript compatibility issues, as ts has not been very backward or forward compatible.
Last time I was this early Denos still roamed the land. Great episode guys, loving seeing you in the same room again!
Deno is pure joy! I'm writing an RSS/Atom feed deserializer lib using Deno and having a blast with it. 😊
Tried it last year and was really happy with it, but the http module wasn't ready yet, so I need to check progress on that front. I also need to see how well SSR can be done with React or Svelte, and then I might just start pushing the change for the companies I work with. Last year it was still young; it might be ripe now.
Everything is back to _normal_
^ Uncaught ReferenceError: normal is not defined
Laughed way too hard when Surma pulled out the ruler band.
Yes we've missed it. Great episode thanks
I feel the package management is somewhat inspired by Golang's syntax, with their own versioning flavour added to it.
Just saw the title and I was mega excited - I saw the talk live and was like wow - I've played around with it over and over since then, and I'm honestly looking forward to when I get the chance to use it in production.
Also I will call it deno not dino - sorry first impression counts 😬
Deno makes more sense too, from noDe
Should I start pronouncing Node as "Nodi" from now on? 😱
do you pronounce sudo as Sue Doo?
At 7:36 I hoped that Comlink would now be available on deno.land, but that doesn't seem to be the case. Is the GitHub version compatible with Deno?
If you use any of the npm CDNs (skypack, unpkg,...) you can use Comlink in Deno today. It was mentioned in their 1.12 announcement blogpost!
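Something along these lines should work (the CDN, version pin, and worker.ts path are just examples, not an official recommendation):

```ts
// Pulling Comlink from an npm-to-ESM CDN; any CDN that serves ES modules should do.
import * as Comlink from "https://esm.sh/comlink@4.3.1";

// worker.ts is a hypothetical module that calls Comlink.expose(...) on its side.
const worker = new Worker(new URL("./worker.ts", import.meta.url).href, {
  type: "module", // Deno workers must be modules
});

const api = Comlink.wrap(worker);
console.log("wrapped worker proxy ready:", typeof api);
```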
So when will the Deno-compatible Google APIs be released?
So we host the source of our TypeScript packages at a web URL; that's great. My question is: does Deno use the TypeScript config of the imported package and compile it differently from the local package?
Say the remote imported package is "strict: false" and your local project is "strict: true". When importing source directly, TypeScript will compile the imported project source using the local tsconfig, which will fail if the imported source violates the local configuration (as would be the case in my above example).
If so, does that same rule apply for importing local folders?
It's been a while since I played with Deno, but if I remember correctly you don't get much access to TypeScript flags like you normally would with tsconfig.json. I guess the idea is that you can't run into issues with differing configurations if you force defaults. I'm really not sure what happened to tsconfig.json. If it's gone in Deno I'll miss it, since I liked to set some of my flags to the strictest settings.
That moment I realized that "Deno" is an anagram of "Node"... I see what you did, Mr. Dahl!
In fact it's not only an anagram, it's a great play on words: 'node'.split('').sort().join('') === 'deno'
@@yevheniiherasymchuk probably how we came up with it lol
Great overview - thanks, guys!
You've convinced me to give Deno a second look. I saw the import from URL business and thought how on earth could that be OK for security? What if the URL-owner hotswaps the code? (Wish you addressed that directly actually) but with lack of permissions, it's really not that big of a concern, is it? What if they go offline though, are you hooped? At least with node/npm there's a central repository to which there's probably a handful of mirrors out there. Yarn 2 and others are taking it a step further by recommending you actually check in your dependencies.
How do 'bundle' and 'compile' work with dynamic imports?
> What if the URL-owner hotswaps the code ... it's really not that big of a concern, is it?
Correct! If a script runs without permissions, it is sandboxed, like in the browser. You have to trust not just a module, but also the registry. If, over time, more authors publish their modules on their own domains, this trust will converge on a single entity per module.
> What if they go offline though, are you hooped?
If you are concerned about this, you should use mirrors or check in source code. You could write a tool that uses import maps to rewrite all module specifiers to a local folder on disk, or a local proxy server.
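To make the sandbox point concrete, a minimal sketch (example.com is just a placeholder):

```ts
// fetch-check.ts - network access is opt-in.
// deno run fetch-check.ts              -> no net permission granted up front
// deno run --allow-net fetch-check.ts  -> network allowed
const status = await Deno.permissions.query({ name: "net", host: "example.com" });
console.log("net permission:", status.state); // "granted" | "prompt" | "denied"

// Without the flag, this either prompts interactively or fails,
// depending on your Deno version and settings.
const res = await fetch("https://example.com/");
console.log(res.status);
```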
Re sites going offline, it isn't a big deal in practice because most of your deps are hosted on a static CDN like deno.land/x or transpiling CDN like esm.sh. While you theoretically could import from anywhere on the web, mostly you just don't 😛
It's sorta like git: having the option to keep everything decentralized is great, but we've basically all converged on using a centralised service to host our repositories.
Recently there was a bug in esm.sh that seemed to only affect one package I wanted, and I didn't even notice locally because of the import caching! I only found out when CI started failing, so I raised an issue on GitHub and the author fixed it in like an hour.
@@deno_land > If over time more authors will publish their modules on their own domains, this will result in this trust converging on a single entity per module
An entity per module sounds OK. Over time I've come to figure out which authors produce high-quality npm modules, but that doesn't mean I believe they're good stewards of keeping their domain running 24/7 (or they can handle a big influx of requests).
> you should use mirrors or check in source code
Does Deno help at all with this? Do I have to manually download each URL I want to use, come up with my own conventions for where I put 3rd party libs, and what not? That sounds like re-inventing node_modules unless maybe I can choose to have Deno cache into the project dir and just check that in.
@@mpenatwork You have the option of either using the `HTTPS_PROXY` env var to set up a transparent proxy, or rewriting all URLs with an import map. The latter is actually relatively trivial.
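A sketch of what that import map could look like (the module URL, version, and vendor path are made up); you'd check the vendor folder in and run with the `--import-map` flag:

```json
{
  "imports": {
    "https://deno.land/std@0.105.0/": "./vendor/deno.land/std@0.105.0/"
  }
}
```

Then `deno run --import-map=import_map.json main.ts` resolves those imports from disk. I believe newer Deno releases also grew a `deno vendor` subcommand that generates this kind of map for you.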
Happy to see you again 👏👏👏👍
I guess we know what we need next 19:20 (a video on WebGPU).
No one’s going to mention the force field generators? Or are those tanning lamps?
They are anti virus lights.
@@dassurma can't tell if joking: UV totally is effective against viruses, but those don't look like any I've seen?
Another thing, how are common dependencies handled? Like if two different packages depend on React (let's pretend I'm doing server-side rendering)? I assume they'll each pull in their own copy of React unless they just *happen* to use the exact same URL (and assuming Deno does the one-instance-per-URL thing like Node --- which also raises the question of URL normalization). Sometimes this is a problem if the lib needs a single instance, sometimes it isn't (like two copies of lodash should be fine). But with npm/yarn we can use "resolutions" to force a single copy, in Deno, are we out of luck?
Import maps can help I think, assuming the module specifier is simply "react". And I think a single URL repeated in the same runtime is one instance, per the ES module spec.
@@arbitraryCharacters Can you map one URL to another URL? I'm thinking about scenarios where I don't control the `import` line because it's buried in some lib. Can import-maps use wild cards? It would be kind of annoying to have to put 20 different mappings for the same lib in there because they each have a slightly different patch version.
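As far as I know you can: import map keys can be full URLs, and a key ending in a slash remaps a whole URL prefix, but there are no wildcards. So something like this (versions are made up) pins stray React imports to one copy, though you'd still need one entry per version that actually shows up in your dependency tree:

```json
{
  "imports": {
    "https://esm.sh/react@17.0.1": "https://esm.sh/react@17.0.2",
    "https://esm.sh/react@17.0.1/": "https://esm.sh/react@17.0.2/"
  }
}
```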
It's deno, not deno.
Just wondering how a tool like ESLint would work?
I assume ESLint itself isn't compatible with Deno atm due to it relying on Node's standard library.
But if it was made compatible, how would you run the linting jobs?
1. Nothing stops you from also using node to run eslint on a Deno project! They're just different tools that solve different problems.
Discordeno, the most popular Deno-first discord library, uses Node to run prettier.
2. Deno has a built-in linter that (iirc) follows eslint's defaults: `deno lint`. It uses SWC, so it's blazing fast too.
I would use deno lint and deno fmt. Super fast, no config to wrestle with, no third party dependency. It’s beautiful.
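If you do need an escape hatch, both tools use comment directives rather than config, if I remember right (the rule name below is one of the defaults):

```ts
// deno-lint-ignore no-explicit-any
export function debugLog(value: any) {
  // deno-fmt-ignore
  const padded = [1,    2,    3]; // deno fmt leaves this line's spacing alone
  console.log(value, padded);
}
```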
Deno supports NodeJS standard libraries.
How do you use a Joy-Con as a presentation clicker? Dope idea, must share a how-to please! 🤯
Right here! ua-cam.com/video/pIIHJ-NIyes/v-deo.html
Good luck convincing live subtitles that it's not Dino heh
I subtly wished a million times that the real Ryan would pop up on the show.
i use dynamic imports in a custom static site generator i made (in node tho)
did i hear "surprise based" maybe instead of "promise based"... made my day :)
Surma said Deno instead of Deno a few times. Though he also sometimes said Deno.
Oh, and they have npm package compatibility up at this time
A disadvantage of having `window` in Deno is that it was heavily used to detect whether the runtime is a web browser.
You should use `typeof document !== "undefined"` to detect DOM, and for everything else you should test for each feature individually. Probing for `window` breaks down in workers.
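i.e. something along these lines (an illustrative sketch; the `in globalThis` checks are just a type-checker-friendly spelling of the same idea):

```ts
// Prefer feature detection over environment sniffing.
const hasDOM = "document" in globalThis; // true on a page, false in workers and in Deno
const inDeno = "Deno" in globalThis;     // true only in Deno
console.log({ hasDOM, inDeno });

// For everything else, probe the specific feature you need:
if ("fetch" in globalThis) {
  // fetch is available here - page, worker, or Deno
  console.log("fetch is available");
}
```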
Import maps are really essential because without it, managing package versions across multiple files (efficiently) would be a pain.
Nah, you could have a central file for the imports and re-export them.
Though that does not support trailing slashes. Say /packageV2.2/ contains 50 different files with different functionality, with re-exports you'd need 50 imports, with import maps you need 1.
@@emilemil1 You'd need 50 imports in both cases, look it up
What import maps achieve is either shortening the import specifiers (not the actual imports) or allowing you to change deeply nested import origins
@@Soremwar With maps you can make a single alias for the package and its version, making package management simple. It's only one line to change if you want to update the version, change host, etc.
With re-exports you need to specify the entire path, with version, for every individual module in the package that you want to import. For 50 modules that's 50 paths compared to 1 import map alias. It's bloated and it's unwieldy to edit without relying on a tool.
It becomes even more advantageous if you need to support different versions of a package in the same project, with maps supporting scopes and not requiring a new alias per version, while re-exports require duplication of all imports and a new alias.
To be extra clear, there is no significant difference outside of the files defining the aliases. An import map is more concise and easier to maintain compared to a file with re-exports.
@@emilemil1 I don't know if you are confused or if you haven't actually used deps.ts vs import maps, but I can guarantee you re-exports aren't that cumbersome compared to import maps. This is the most straightforward case I could think of to compare them:
import { React } from "./deps.ts"; // Regular imports
import React from "react"; // With an import map
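For anyone following along, the two files behind those imports would look roughly like this (the URL and version are placeholders):

```ts
// deps.ts - the re-export approach: one line per re-exported dependency.
export { default as React } from "https://esm.sh/react@17.0.2";
```

And the import map file, passed with `--import-map=import_map.json`:

```json
{
  "imports": {
    "react": "https://esm.sh/react@17.0.2"
  }
}
```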
0:20 "Let me slowly send microbes to you"
Accessibility Note For Future Videos: When referring to something written on the screen, it would be great if you could read out that content for the benefit of listeners.
I don't know why the JavaScript community hasn't moved toward it yet
Because it's still in development, things are still being figured out. This means there's no stable ecosystem for you to use yet.
@@spartanatreyu If you are referring to browsers and web standards in general, sure. HTTP/2 Server Push has recently been removed, right? In our case we have been using it to power a number of Fortune 500 companies' products for over a year now, and it has led to significant improvements in conversion along with Lighthouse 100 scores 🎇 for bragging rights.
@@markuskobler you might have noticed that those std imports were not at 1.0 yet. Confusingly, Deno is done, but its standard library is not.
@@SimonBuchanNz we are very aware and have not had any more issues with upgrading than with any other language or framework this last year. If anything Oak, now at version 9, is a bigger source of change. It's no different with Node: we don't ban usage of express (over 10) or Jest (over 40) because they have dependencies that are version "0.x.x"
Deno is also the future of Machine learning. Thanks
This is the most enjoyable show about development
We did miss this :)
Typing URLs for every package is pretty ugly. I predict they will add some package.json eventually. Or is everybody supposed to create their own `package.ts` and re-export everything?
Yes, you create a deps.ts and export your dependencies from there.
@@callthecapital8631 Thanks. My intuition was correct then. I personally still don't like having to re-implement package.json in every project. :) I feel there should be a way to re-use the package.json model and improve it with additional security. Something that would lock-down "lodash" from accessing my filesystem. :)
@@RushPL1 Or, you know, import maps.
With the permissions and other features (like import maps) that you have to pass on the command line, I imagine there will be an upsurge of .sh/.bat files to run the script with all the command-line arguments that are needed.
Sadly Deno does not support all crypto.subtle functions - for now it's only a small subset, unfortunately (I wouldn't call it support just yet)
I think I can get away with it because I was only claiming that the global exists :3
(Thanks for pointing this out!)
The E in Deno is pronounced the same way as the E in Node.
So the E modifies the O. No confusion now!
At the moment I prefer Node, for many reasons. For example, I prefer the package manager; I use ReScript and I don’t understand why I have to install a TypeScript compiler that I don’t need.
Job wise, Node still wins
Personally, I love Deno.
I wish it grows faster though.
amazing!
It's very good
Where's the talk?
Linked in the description!
Guess we will ignore the fact that he's using a Nintendo switch controller..ok
thank God im not crazy i have so many questions
When do we get denolectron?
Delectron? Positron? There's got to be a better name.
The Tauri project (tauri.studio/en) plans to support Deno as a backend in the future, but for now it is Rust only.
The question is: which intrinsic problem does Deno solve that Node.js doesn't, and is a deal breaker for the business specs?
My underlying thought is: somebody got tired or felt "stagnant" with a tech that got stable, and he needed to invent something else. These kinds of experiments are good for the creative mind, but released into the wild, not every person has the right judgement to know which is the correct tool to use on any given project. Especially people who want to be cool, be accepted by the group of edge tech, or just want to show off that they know this new thing.
That “somebody” who started the Deno project is Ryan Dahl, who *also* invented Node. I can only recommend the JSConf.EU talk he gave (link in the description) with a rundown of why he thought it's worth starting over.
@@dassurma yes I'm perfectly aware it is the same person, and because of that people fall into the authority-figure "cult" and blindly follow someone just because they are famous or did something great once. If Deno doesn't solve a business roadblock that Node can't, then there's no point in changing tools other than: look at me, I'm using "cool" new tech and we have to tear down the whole app to build it again. (Reminds me of the term hipster.)
@@draco_2727 considering that Ryan is notorious for not having social media and generally flying under the radar, the "look at me" angle doesn't make sense to me. The reasons he gives in his talk do make sense to me tho. 🤷♂️ After 10+ years of Node, the JS language, the npm ecosystem and our knowledge about backend JS have progressed considerably. It's hard to incorporate those lessons learned and new language constructs when you have an entire standard library with backwards compatibility.
@@dassurma that "look at me" was meant for the users, not the creator. If you can't incorporate lessons learned isn't that somehow failure of the architect(ur)? Is there a new #C response to whatever shortcomings C# had or has, for instance. To be clear, creative minds have to satiate their need for experimenting and creating; in this context the creation and exposure of these tools fall in the hands of millions of developers who fall in a wide range of knowledge, experience, insecurities, peer pressures, etc. Which directly or not result in devs hating JS, being tired of front end development, company's job post that require a unicorn developer to match their tech stack, and so on. I'm at my 19nth year of web development and something I've learned (that's not new) is to keep things simple in the small and grand scheme of things. PS: please take into account that its hard to have a reasonable, sane and productive discussion or conversation just through text like this.
The shown architecture diagram is out of date.
We are working on an up to date one! Should be in the manual soon. The general concept Surma explained is correct though, even if the image is a little dated.
@@lucacasonato nice! Keep up the good work!
That’s not a slide controller, that’s a joycon
The whole Deno ecosystem, with its tooling, was inspired by Golang; initially Deno was written in Golang.
I love Deno!
It's Deno, not Denno.
hmmm denno
@@jakearchibald Haha, your Danone reference isn't wasted on me!
Probably not obvious from my comment, but I agree with you guys. It's 'dee-no', surely :)
Absolutely love you guys and the content ❤
I'm so happy that my native language Finnish doesn't have this pronunciation bs ;)
Deno and node should just merge and work together. I know it’s “impossible possible”, but if there’s the possibility, I think it’s worth making it happen.
I wish someone could teach me deno :)
We have a manual to get you started: deno.land/manual@v1.13.1/introduction. You can also hop on Discord if you get stuck on anything specific: discord.gg/deno
the 'over the shoulder' camera is way too distracting for me, please consider not doing it
We have been doing the over-the-shoulder shot _for years_ before WFHTTP 203 (go back to some older episodes!)
web-aJSONed. :)
Goose-bumpy stuff... :)
Deeno is as ridiculous as JIF. I'll stick to Denno and GIF.
But Deno sounds like dino, which is the Dutch word for dinosaur, and so deno.land means dinosaur country.
Think D-no. The E silent/missing
Atrocious visuals, great conversation.
Please kill JavaScript