P N
Joined 25 Feb 2013
Transform Blender animation into a laser-cut sculpture for projection mapping
1. Download Inkscape: inkscape.org/release/inkscape-1.4/
2. Make a simple box in Makercase: en.makercase.com/#/
0:00 Introduction
1:35 Design a simple lasercut box with MakerCase
3:43 Download Inkscape
4:21 Import the MakerCase box into Inkscape and edit it
11:47 Making orthographic renders in Blender
32:56 Trace orthographic bitmap renders into vector outlines in Inkscape
51:58 Save as DXF AutoCAD R14 file for laser-cutting
Views: 308
Videos
Blender to ComfyUI for Beginners
Views: 2.8K, 14 days ago
As an extension of the Vanitas project, this video explains how ComfyUI works and how we can use it to integrate Stable Diffusion into our Blender workflow. Links: 1. The tutorial was inspired by Mickmumpitz's much more detailed workflow, which you can explore here: ua-cam.com/video/8afb3luBvD8/v-deo.htmlsi=XNBVlRau6Yv0ThLm 2. Installing ComfyUI on Mac tutorial: ua-cam.com/video/m9jg1fdOiVY/v-d...
Two quick ways to get vectors from Blender
Views: 74, a month ago
Method 1: Freestyle SVG Exporter
Method 2: Render a clean frame and do a bitmap trace in Inkscape
Download Inkscape: inkscape.org/
0:00 Introduction
0:45 Freestyle SVG Exporter
5:09 Tracing vectors from PNG renders in Inkscape
Video to Mocap to Abstract Animation in Blender (Beginner Level)
Views: 355, a month ago
This tutorial is aimed at beginners and uses smartphone video, Blender and the Rokoko Vision online service to generate and animate motion capture video. My apologies that the mocap skeleton rig from Rokoko has a very strange rest position; I didn't do any retargeting here as I wanted to keep it as simple as possible for beginners. 0:00 Introduction 1:23 Uploading video to Rokoko and generatin...
Remote Gestures: A Prototype Performance, Visualisation Research Centre, Hong Kong, 2023.
Views: 93, 2 months ago
This performance was based on a research project led by Roberto Alonso Trillo and Peter A C Nelson that sought to link creative processes and devices that were difficult to access during the COVID-19 period. Using our native disciplines as a test case, we explored how to remotely connect a violinist and a painter over any distance with minimum latency and maximum creative exchange. This went th...
Machine Visions Video Reel
Views: 2.1K, 2 months ago
This is a short video reel documenting some physical exhibits and performances made as part of the Machine Visions exhibition, held at the Osage Art Foundation from November 2022 - February 2023. This exhibition focused on how machine learning is changing the nature of the creative process and showcased artworks and performances made using bespoke machine learning systems for generating sound, ...
Combining rotation and linear acceleration from an IMU to drive motion in TouchDesigner
Views: 73, 3 months ago
Using an IMU to drive colour parameters in TouchDesigner
Views: 59, 3 months ago
Selecting an appropriate signal from the IMU and mapping it onto the hue offset in TouchDesigner.
Using IMU data to control a switch in TouchDesigner
Views: 51, 3 months ago
Selecting an appropriate signal, mapping it into integers, and using these integers to drive a switch.
Rotating 3D objects in TouchDesigner with IMU Accelerometer Data
Views: 188, 3 months ago
Using either TouchOSC or the SensorTile, this video shows how to extract a rotation signal and select and map it onto the appropriate transforms for a 3D object.
Connecting SensorTile and TouchOSC to TouchDesigner
Views: 118, 3 months ago
1. Bridging app for the SensorTile can be downloaded here: github.com/MetaBow/MetaBow-Toolkit 2. TouchOSC Bridging app can be downloaded here: hexler.net/touchosc
IMU Sensors for Touch Designer Intro
Views: 116, 3 months ago
We will introduce the TouchDesigner file in the workshop with some beginner ideas for using IMU signals to drive various operations in TouchDesigner.
Touch Designer to XArm Robot: Master Pattern Generation File Walkthrough
Views: 201, 9 months ago
Here's the TD file. Please feel welcome to make improvements and share! www.dropbox.com/scl/fo/6gf03t81ttj3uo46lu3bf/h?rlkey=ebd53zxb8l89l29afy68pjyvp&dl=0
Touch Designer to XArm Robot: Drawing a Circle and altering with sound
Views: 162, 9 months ago
Touch Designer to XArm Robot: Simple Noise Drawing Network
Views: 191, 9 months ago
Touch Designer to XArm Robot: Simple LFO Network
Views: 193, 9 months ago
Laser etch a 3D object into a plexiglas stack from Blender to Inkscape
Views: 968, a year ago
Exporting from Blender Geometry Nodes to Ceramic 3D Printer
Views: 404, a year ago
Simplified One and Two-Point Perspective from Observation
Views: 117, a year ago
Vanitas Exercise 6: Animating with Noise Curve Modifiers and Exporting your Animation
Views: 422, a year ago
Vanitas Exercise 5: Keyframe and Physics Animation
Views: 420, a year ago
Vanitas Exercise 4: Procedural Lily Flowers
Views: 1.3K, a year ago
Vanitas Exercise 3: Vase Geometry Nodes and Texturing
Views: 673, a year ago
Vanitas Exercise 2: Importing external models and lighting
Views: 606, a year ago
Vanitas Exercise 1: Modelling and texturing a book
Views: 1.1K, a year ago
The ComfyUI client is about to launch, which will make installing Comfy extremely easy!! ;)
I am currently experimenting a lot with Comfy, and I am really looking for someone who might be interested in being a bit of a learning friend who's also at a medium level of ComfyUI stuff. I feel like if you're here, we might be on the same level. So let me know if anyone out there is interested in making some movies together and connecting :)
You can use mask conditioning to keep the ball a ball, for example. It's a bit of a hack, but it might work since the path is fairly simple: mask the path and increase the weight of the "ball" prompt for that area specifically, then combine the conditionings and hook that into the KSampler.
Cool idea. It got me thinking about whether the AI could retain the information from the models and use that to create the image instead; I suppose it would need to be baked into the rendering software.
I have heard of things like real-time mocap, where live rig data from a mocap suit is fed in to drive a mesh, which can then be reskinned using an SD model. And there are a few pipelines that generate textures that then get piped into Blender. I have no idea if the actual mesh scene with face and vertex info could be fed in before rendering to frames, but there are so many tools being produced that, who knows, it might already exist or be on its way?
You have a very interesting channel. I assume you’re a teacher or lecturer of some form. If so, what do you teach?
Hi there, I teach a broad range of things, from traditional drawing to 3D graphics, fabrication and interactivity. I make these videos to help with my classes, but I also keep them around on UA-cam in case they are useful to anyone else, or in case I make some dumb mistake, in which case people usually correct me in the comments, and I try to fix my approach 😀
Excellent video, and EXACTLY what I needed, as it turns out.
Thanks for the video and detailed explanation. I am trying to export a globe with a geometric pattern (an uploaded .png file with a pattern) that I added to the whole globe in the Materials tab. When I try to export it using the Freestyle SVG exporter, it only exports the outline of the globe and not the outline of the pattern on the surface of the globe. How can I fix this issue? Your assistance is much appreciated.
If the pattern is articulated by the geometry of the mesh (not just a texture), then you need to adjust the Freestyle export settings in the View Layer tab (below Render and Output in the Properties editor). Try 'Crease' or 'Suggestive Contour' or 'Ridge and Valley' and see if you get the results you want. If the pattern is just a texture, you can always render an image, then open it in Inkscape and do a bitmap trace to generate vector lines.
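For anyone who prefers to flip those Freestyle feature-edge toggles from a script rather than the UI, a minimal sketch might look like the following, assuming the default active line set on the current view layer (the property names are standard bpy, but treat the snippet as a starting point rather than the exact setup used in the video):

```python
import bpy

# Active Freestyle line set on the current view layer
# (UI: Properties > View Layer > Freestyle Line Set).
lineset = bpy.context.view_layer.freestyle_settings.linesets.active

# Follow geometry features rather than only manually marked edges.
lineset.select_crease = True
lineset.select_suggestive_contour = True
lineset.select_ridge_valley = True
```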
HKBU students: 🗿 Random blender user finding this class: 😱😳🙀
BRO THANK YOU SO MUCH, YOU ARE A HERO. I was making a 3D model of my grandpa's dog who died, to cheer him up, and this was a life saver.
Great work!!
Typo. Weird 😅
Why not😂
Without additional context this clip is both confusing and meaningless.
Lol. Thanks. I guess it's not really supposed to make sense by itself; it's a documentation reel from an exhibition we did 3 years ago that we usually present in context, either on our website or when describing what certain parts of the exhibition or performance program looked like to people who want to know more about it. I didn't expect many people to watch this out of context; if I had, I would have put a voice-over or some text in.
What even is this lol? 😂
Yeah, I can see that this must have looked very random, I'm amazed that it made it into people's feeds. I've updated the video description to give more context. This short reel is supplementary material for when we want to present some of the artworks and performances that were part of an exhibition and performance series we made 3 years ago. More context on the video is usually given in person when we present it, or when it's embedded in a website.
@@peternelson8856 I think the fact it had little context actually made it more interesting to me personally. I also think it's the reason it reached my feed so its a welcomed surprise.
@@skrillex544 That's super funny. Glad you enjoyed the out-of-context exhibition documentation!
Thank you! Great tutorial. It would be interesting to animate this lily flower blooming!!
Hi! I entered the port that is shown in Protokol, but I still don't see the data from my TouchOSC app in the CHOP.
@@sakharov00in hi there, sorry for the delay! Did you press the play icon in TouchOSC to make sure you are sending data?
This is very interesting and gets me thinking too. Thank you for sharing!
Thank you! Which coordinates did you paste into column A and which into column B, the longitude or the latitude?
Thank you very much. It's very nice to see how people use Blender to perform a variety of tasks: video editing, photo editing, wind tunnel simulation, even laser cutting is possible with the help of Blender. This is because once you open this program, you don't want to close it. (Translated by Google)
If you're having problems with the boolean modifier cutting cleanly, try changing the boolean option from Exact to Fast
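If you would rather switch the solver from a script, a hedged sketch is below; it assumes the modifier is named "Boolean" (Blender's default name for it) and applies to recent Blender versions that expose the Exact/Fast solver option:

```python
import bpy

obj = bpy.context.active_object
mod = obj.modifiers.get("Boolean")  # "Boolean" is Blender's default modifier name
if mod is not None and mod.type == 'BOOLEAN':
    # 'EXACT' is the default solver in recent Blender versions; 'FAST' is the
    # older BMesh-based solver, which often cuts more cleanly on tricky geometry.
    mod.solver = 'FAST'
```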
Here is Retroshaper's excellent geometry nodes tutorial for making a vase generator.
👀👁👀👀👀👁🤔🤷♂ Interesting link........
omg lol. Thank you! There was a crazy flood in my neighbourhood the day I recorded this, clearly my brain wasn't 100%. Thanks for pointing out the error!
@@peternelson8856 Ouch - I hope the neighborhood is recovering well...... Lol - at least you had a good reason. Many people just forget...
Thank you! But how do you export SVG animations, and how do you then open them? And where?
Hi there, sorry for the late response. I don't know the answer to this question, other than to export .svg frame by frame and then use CSS or something like that for your animation. There are other UA-cam tutorials available for this sort of thing.
Thanks a lot! I just got my 3D printer yesterday and didn't have a clue on how to print!
You're welcome, happy printing!
Thank you so much))))
You're welcome!
Is there a way to export Freestyle edges just like Rhino does? It exports multiple objects as 2D vectors, with all shapes following occlusion, so only the visible edges of the objects appear, and with very sharp accuracy.
If you go into the 'View Layer' Properties tab, you can change which types of edges will be rendered as vectors, from visible, to silhouette, creases, borders, and so on. To render all edges, use Edge Mark, but to follow certain geometry features, untick Edge Mark and play around with border, silhouette, crease, etc. Does that solve the problem?
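As a rough script equivalent of those UI steps, something like this should toggle the edge types and visibility on the active line set; it is a sketch under the assumption that the default line set is active, not a definitive recipe:

```python
import bpy

lineset = bpy.context.view_layer.freestyle_settings.linesets.active

lineset.select_edge_mark = False   # stop limiting output to manually marked edges
lineset.select_silhouette = True
lineset.select_border = True
lineset.select_crease = True
lineset.visibility = 'VISIBLE'     # only edges the camera can actually see
```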
Is there any method to use occlusion, for example if you want to export an SVG with multiple objects? What is the solution?
If you go into the 'View Layer' Properties tab, you can change which types of edges will be rendered as vectors, from visible, to silhouette, creases, borders, and so on. To render all edges, use Edge Mark, but to follow certain geometry features, untick Edge Mark and play around with border, silhouette, crease, etc. Does that solve the problem?
@@peternelson8856 will try.
Thank you so much man!!
Merci!!
Saved my college graduation! Thank you 😅
Amazing. I'm happy to hear that!
Thank you so much for the video, it helped a lot! One question: is it possible to also include the faces / colors in the svg render besides the outlines? Many thanks
There are some settings to play around with the colours of the vector lines, but as far as I'm aware, you can't convert colours from the shader to a vector file directly. But read the documentation here and if you find a way, let me know! docs.blender.org/manual/en/latest/render/freestyle/view_layer/line_set.html#face-marks
I'm unable to export an .svg file, only .png. What am I doing wrong?
Have you ticked all the settings for Freestyle export as well as set an output location for the SVG file?
Hello, I was following exactly the same steps but the dimensions are changing. Is there any way to fix this?
Can you be a little more specific - which dimensions are changing and in what part of the process?
@@peternelson8856 I created a mesh (100mm x 25mm) in Blender. When I open the SVG file in Inkscape, it's about 8 times bigger than it's supposed to be. I still haven't figured out why it changes.
@@elifmiami Yes, sorry this is something I need to research a little further and probably need to record another video. I think it might relate to the resolution of your output and the page setup you have in Inkscape. It is discussed in a number of forums: inkscape.org/forums/cutplot/svg-scale-exporting-off/
Very informative. My only suggestion is to raise the volume of your video audio; I had to crank a rather powerful system just to make out what you're saying. Otherwise, thanks for the lesson!
Yes, sorry about the sound. I had the microphone setting wrong for a while. Hopefully I'll find the time to fix it.
@@peternelson8856 no worries, don’t mean to discount your efforts(: you helped me figure out exactly what I needed to.
amazing
Btw, sorry for the delay between my voice and the screen capture video, I had to record this one on my laptop, which lags a bit.
This was really helpful, thank you! Any thoughts on how to use the X, Y, Z coordinates to create points on a mesh / grid? It seems like the bpy module is REALLY feature-rich. I was a bit frustrated that I couldn't find a command to make a point instead of a cube..
Hi MikeGG, I don't know the answer to your question off the top of my head, but from a quick search, it seems that this forum post might have your answer: blender.stackexchange.com/questions/151299/plot-single-vertices-based-on-xy-coordinates-from-csv-file . I don't know if this would be a super clean solution for GPS data, but I'd love to see the result!
@@peternelson8856 Thanks again, indeed that got me started! If you read the CSV into a list (mylist = []; ... mylist.append([float(x), float(y), float(z)])), you can then create the point cloud like this: pc = point_cloud("point-cloud", mylist). Unfortunately, that opens up a whole can of worms on how exactly to go from a point cloud to a surface.
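For readers following along: point_cloud above is the commenter's own helper, not a built-in. A rough, self-contained sketch of the same idea using bpy's mesh API might look like this; the file path and column order are assumptions:

```python
import bpy
import csv

CSV_PATH = "/path/to/points.csv"  # hypothetical; columns assumed to be x, y, z in metres

points = []
with open(CSV_PATH, newline="") as f:
    for row in csv.reader(f):
        x, y, z = (float(v) for v in row[:3])
        points.append((x, y, z))

# Build a vertex-only mesh (no edges or faces) and link it to the scene.
mesh = bpy.data.meshes.new("gps_points")
mesh.from_pydata(points, [], [])
mesh.update()

obj = bpy.data.objects.new("gps_points", mesh)
bpy.context.collection.objects.link(obj)
```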
@@manmustbuild Ever got further with that?
Hi there P N. I am not a coder, but was thrilled to have found your tutorial. I have an interest in modeling and 3D rendering, and am looking to test this method for adding 360 degree cameras based on gpx data. Specifically type: Panoramic Equirectangular cameras. Any tips on adjusting the script? Thanks from Canada!
Hi Hadrian, do you mean adding a 360 camera at every GPS co-ordinate within Blender? If that's the case, you would replace the "bpy.ops.mesh" line in the if statement with a line that adds a camera at the float x, y and z coordinates. To work out exactly what this should look like in your Python script, I recommend you change one of your Blender windows to 'Info' and add a camera. The Info panel will show you the Python command for actions you do in the editor. For example, when you add a camera, the Info panel shows "bpy.ops.object.camera_add(enter_editmode=False, align='VIEW', location=(0, 0, 0), rotation=(1.10871, 0.013265, 1.14827), scale=(1, 1, 1))", and when you change the camera type to panoramic, it will show "bpy.context.object.data.type = 'PANO'". So if I were you, I'd make a csv with a small number of values and test a script by changing the add cube line to an add camera, then set the camera to panoramic and equirectangular. If you can get it to work with a small .csv file, then I think you'll be on your way. Have I understood the question correctly?
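A minimal sketch of that modified loop, assuming the x, y, z values in the .csv are already converted to metres; the file path is hypothetical, and the exact property for the equirectangular sub-type depends on your Blender version:

```python
import bpy
import csv

CSV_PATH = "/path/to/coords.csv"  # hypothetical; x, y, z already converted to metres

with open(CSV_PATH, newline="") as f:
    for row in csv.reader(f):
        x, y, z = (float(v) for v in row[:3])
        # This line replaces the bpy.ops.mesh cube call from the tutorial script.
        bpy.ops.object.camera_add(location=(x, y, z))
        cam = bpy.context.object.data
        cam.type = 'PANO'
        # The Equirectangular sub-type is a Cycles panorama setting; depending on
        # your Blender version it is cam.cycles.panorama_type or cam.panorama_type,
        # both taking the value 'EQUIRECTANGULAR'.
```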
@@peternelson8856 Hello PN, Thank you for the prompt reply! You certainly did! I will make this attempt and see what sort of results I can get. I find this very exciting and can't wait to experiment. I will share my findings with you and ask if I have any further questions - provided that is alright with you. Thank you so much and have a great week!
@@hadrianlaing7251 Absolutely. I have no idea what you're making, but would be super keen to see the results! Best of luck!
How do you convert your GPS coordinates to metres?
Sorry for the late reply. Haversine formula converts longitude and latitude to metres. You can copy the formula setup from this spreadsheet: docs.google.com/spreadsheets/d/1p0xs_bRE3p-d9-lKL7cuPmq6sAavJ9LIW1U_pi3PDdw/edit?usp=sharing
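For anyone who wants to do that conversion in Python rather than a spreadsheet, a standard haversine sketch looks something like this; treating east-west and north-south offsets from the first GPS fix as local x/y is one common approach, and the coordinates below are just placeholders:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6371000  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two points given in degrees."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

# Placeholder coordinates: offsets of one GPS fix from the first fix, in metres.
origin_lat, origin_lon = 22.2338, 114.2226
lat, lon = 22.2350, 114.2300
x = haversine_m(origin_lat, origin_lon, origin_lat, lon)  # east-west offset
y = haversine_m(origin_lat, origin_lon, lat, origin_lon)  # north-south offset
print(x, y)
```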
I actually learned something... always know where your Dropbox file is ;D
yeah, you can tell I'm a real pro at this lol
Some strokes are not taken into account on my side, like the boundaries of my object (it's just clipped).
Check the Freestyle settings, but there could be other factors at play, such as the clipping settings on your camera?
Nice bro
I'd like to chime in here. For some reason, my Blender application wasn't saving my file to an accessible place. I saved it to the desktop, and it was all better. I just have to click the icon now.
For some reason I can't mark a Freestyle edge. I've watched this video like 4 times.
Check that you've enabled both the Freestyle SVG Add-on in your Blender Preferences and Freestyle in your Render settings.
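Those two switches can also be flipped from the Python console; a hedged sketch is below. The add-on module name matches the bundled Freestyle SVG Exporter, but the svg_export property is registered by the add-on itself, so its exact name may vary between add-on versions:

```python
import bpy

# Enable the bundled Freestyle SVG Exporter add-on
# (UI: Edit > Preferences > Add-ons > "Render: Freestyle SVG Exporter").
bpy.ops.preferences.addon_enable(module="render_freestyle_svg")

# Turn on Freestyle rendering for the scene (UI: Render Properties > Freestyle).
bpy.context.scene.render.use_freestyle = True

# The add-on registers its own scene settings; this is the toggle its panel
# exposes, though the property name may differ between add-on versions.
bpy.context.scene.svg_export.use_svg_export = True
```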
You probably want to simplify those lines before sending them to a CNC machine or laser cutter. These are polylines, which you will see at high enough resolution. To get curves, simplify in Inkscape.
Hi, can we render the back of the monkey in the monkey render?
Probably, if you reverse the normals?
In the View Layer Properties, scroll down to the Freestyle settings and try changing the type from 'visible' to 'hidden'. I think this should give you the hidden side you are after?
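The script equivalent of that single setting, assuming the active line set, would be a two-liner along these lines:

```python
import bpy

# Switch the active Freestyle line set from visible edges to hidden edges.
lineset = bpy.context.view_layer.freestyle_settings.linesets.active
lineset.visibility = 'HIDDEN'
```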
Damn, Blender can be used for laser cutters too?
Hey, how did you get that terrain.party website to work? Every time I choose an area and download the file, I just get a txt file saying there was no data :(
Don't use the search function; you need to drag the little rectangle to where you want to download. Also, I have noticed that the website doesn't work that well on Safari.
It seems that terrain.party is having some problems. Here is an alternative: tangrams.github.io/heightmapper/#12.48762/22.2338/114.2226
Sorry for the change in sound volume in the Blender clip!
From 17:50 I describe some viewport navigation controls, such as rotating and panning the viewport. Sorry for making you wait almost 18 minutes for this!