Thoroughly enjoyed this and created my virtual environment. Thanks Bryan
Great! Thanks
Thank you for making this video. I just learned why these are important the hard way when I updated matplotlib and some library I was just experimenting with and completely broke my scripts. Now I have some steps to take to start using these.
Yep. But when you learn the hard way, you remember. Thanks
Thanks for helping lift the fog!
You're welcome!
Hi Bryan, I am still struggling with this virtual environment concept. Do you create a virtual env for each new dependency? For instance, would I need to create a VE for installing "Requests" to call APIs, another one for matplotlib, one for django, and so on? If so, what's the best way to organize the various VEs? I am on Arch Linux. Thanks for your help.
Yeah. It's really a convenience for developing a project, not one per dependency. Say you want to build a Django app in Python. First you create a virtual environment for that project, and every pip install you do while it's active goes into the environment's folder, NOT your global default Python installation. You can record those installs in a requirements.txt file with pip freeze. When you activate the virtual env, you get that project's packages and versions; when you want to deploy the app somewhere else, you just provide the requirements.txt file so all the required packages and dependencies can be installed there. That way different projects can use different Python packages and versions. Python projects typically require many packages with their own dependencies, which would be almost impossible to track manually; virtual envs exist to solve that problem. Make sense?
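As a rough sketch of the per-project workflow on Linux (the folder name "venv" and the package names are just examples, not anything required):

    python -m venv venv                  # create the environment inside the project folder
    source venv/bin/activate             # activate it; pip now installs into ./venv
    pip install django requests          # packages go into the venv, not the global Python
    pip freeze > requirements.txt        # record the exact versions you installed
    deactivate                           # leave the environment when you're done

You would repeat this once per project, so each project keeps its own set of packages.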
Great stuff, Bryan! I have seen requirements files used in the past. How do they come into play with venv, or do they come into play at all? Thanks!
The same way you can install a single package using "pip install <package>", you can install all your dependencies using "pip install -r requirements.txt".
And if you work properly (as taught by Bryan :P ), you run the pip install command while your virtual environment is activated.
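For example, to recreate the environment on another machine (a sketch; "requirements.txt" is just the conventional file name):

    python -m venv venv
    source venv/bin/activate
    pip install -r requirements.txt      # installs every package listed in the file, at the recorded versions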
I do discuss this in the video.