Very nice. Things will be even more exciting using Spark with all these new features. One question, though: on an existing installation running Spark 2, how easy is the upgrade?
Delta Lake and Koalas are going to be game changers in the field of data analytics.
That's an impressive demo!
At 30:00, when he wants to scroll down the Jupyter notebook in the video and ends up scrolling the YouTube page :))))))
Where is the `plot` function at 17:05 coming from? Does Apache Spark natively support displaying dataframes in Jupyter notebooks?
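If the demo was built on Koalas, that `plot` call is plausibly just the pandas-style plotting API Koalas mirrors (rendered by matplotlib in the notebook), not anything Spark-native. A minimal sketch with plain pandas; all data below is made up for illustration:

```python
# Hedged sketch: Koalas mirrors pandas' plotting API, so `df.plot(...)` in a
# Jupyter notebook likely renders via matplotlib rather than Spark itself.
import matplotlib
matplotlib.use("Agg")  # headless backend so this runs without a display
import pandas as pd

# Hypothetical demo data
df = pd.DataFrame({"month": [1, 2, 3, 4], "sales": [10, 12, 9, 15]})

ax = df.plot(x="month", y="sales", kind="line")  # returns a matplotlib Axes
print(type(ax).__name__)
```

In Jupyter, the returned Axes is displayed inline automatically, which would match what the video shows.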
the demo was really well done.
Can we get the partition pruning demo or video here on YouTube?
Very good demo, amazing presentation
Michael's energy is infectious, and he gave a really good overview of the Spark story.
Invite me next time and I'll initiate loud applause at the right moments; it just feels like it's missing from such a presentation :)
This is amazing!!
6:28... I wonder if the audience clapped because of the 2x speedup or because of being able to save a few lines of code...
Do I need to install Koalas on every node of the cluster or just on the master?
From my experience with Spark clusters, you will need to install the package on all nodes. I usually use Red Hat Ansible to manage the configuration of multiple nodes with ease.
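As a minimal illustration of that fan-out, here is a sketch that pushes the install to each worker over SSH. The hostnames are hypothetical and it assumes passwordless SSH; Ansible's pip module does the same thing declaratively and handles failures far better:

```python
# Hedged sketch: installing koalas on every worker from the driver node,
# assuming passwordless ssh. NODES and the dry-run default are hypothetical.
import subprocess

NODES = ["worker-1", "worker-2"]  # hypothetical worker hostnames
CMD = ["pip", "install", "--upgrade", "koalas"]

def install_everywhere(nodes, dry_run=True):
    """Run the install command on each node; dry_run only reports the command."""
    results = {}
    for node in nodes:
        ssh_cmd = ["ssh", node] + CMD
        if dry_run:
            results[node] = " ".join(ssh_cmd)  # show what would run
        else:
            results[node] = subprocess.run(ssh_cmd).returncode
    return results

for node, outcome in install_everywhere(NODES).items():
    print(node, "->", outcome)
```

The equivalent Ansible task would target the same host list from the inventory, which is why a single playbook scales to any cluster size.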
great
Nice! When will Koalas be available for R?
22:57: can someone give some details about that “forecast=true”?
The plot could be their custom function, with the forecast being a param... just wild guessing from the looks of it.
🤣😁😁 It looked like a simple linear forecast. For a business case, you need a logarithmic fit dependent on time.
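For what it's worth, a log-in-time trend like the one suggested can be fit with ordinary least squares by linearizing in log(t). A sketch on synthetic, made-up data:

```python
# Hedged sketch: fit y ≈ a*log(t) + b, the logarithmic trend suggested above,
# by doing a plain linear least-squares fit against log(t).
# All numbers are synthetic, chosen only for illustration.
import numpy as np

t = np.arange(1, 11, dtype=float)   # time steps 1..10
y = 3.0 * np.log(t) + 2.0           # noiseless synthetic "usage" curve

a, b = np.polyfit(np.log(t), y, deg=1)  # slope a and intercept b
print(round(a, 2), round(b, 2))          # recovers approximately 3.0 and 2.0
```

With noisy real data the same call works; only the residuals grow. A linear forecast, by contrast, keeps climbing at a constant rate forever, which is rarely how adoption curves behave.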
Can we drop Python support for Spark?
People are applauding optimizations that SQL databases introduced 25 years ago. Do kids really not study databases anymore?