The time and effort you put into all your videos is much appreciated
Thanks. Glad you're finding them useful.
Hey! I just paused the video real quick because I realized, I don't think I've thanked you yet! In the last few weeks, I've spent hours and hours watching your youtube tutorials, many of the same videos more than once, and have been using all of your help to start programming my Pico project. Thank you SO much, your videos are so easy to follow, your explanations are wonderful, and I just couldn't be more appreciative to have you and your guides as a resource! Thanks man!!
Thanks. Glad to hear you're finding this stuff useful! Good luck with the project.
This is one of the coolest videos and content creators I have ever seen on UA-cam. There is massive gimmickry all over UA-cam and very little real stuff like this. The explanation of the whole frame buffer concept was awesome. I purchased an SPI LCD last week and I am going to dabble in the code very soon. Respect!!
Thanks for your comments. Good luck with the project.
The display I got was not compatible, so I ordered an ST7789-based one to test your code. One of the best videos on YouTube on the topic. Subscribed to the channel. Thumbs up!
That's great. Thanks.
I love how you explained this - especially with the graphics.
Thanks. Glad you enjoyed it.
Very thoughtful way to achieve the desired fps throughput. Appreciate the effort put in to explain the details.👍👍 👍
No problem. Hope it helped.
Yes, of course
Nice video, just what I was looking for! Very informative and to the point.
I just started with the Pico a month or so ago, I've tried out various bits of hardware, and your display buffer solutions are top-notch!
Appreciate all the work you put in to this.
Glad it was helpful!
Thanks very much for this video and the continuing series. I dabbled in electronics many years ago, then went wholly into UNIX and Linux computing (and retro gaming) but you and a couple of other YT creators have really got me back into electronics again.
Great to hear! I started in electronics but then moved into web coding. I find these sorts of projects great fun to play around with.
@@BytesNBits Over the past few years, I have just got more and more bored with ever faster and more expensive CPUs and graphics cards to play games that are just complete rubbish! The last AAA game I bought and played (to death) was Fallout 4 back in 2015 - now, for gaming, I just stick to indie and retro games.
I think I mentioned to you before that I just run Linux at home now and I am repairing and rebuilding a lot of old PCs and laptops, especially old IBM and Lenovo Thinkpads.
I have discovered a new love for a computing hobby that started in the days of the ZX Spectrum and Commodore Amiga. To me, it is SO much more interesting playing with all this old hardware and I just relish the challenge of playing around with lower powered stuff and having memory and CPU speed constraints always be a limiting factor that you have to "engineer" your way around.
Gracias por tu vídeo, me ayudó mucho en un proyecto que tengo en mente y fue muy fácil adaptar su uso a la librería del display ST7735 de 1.8 que planeo usar como reloj. Saludos desde México.
Great. I hope the project is working well.
Simply awesome, thank you! 😊🕊
Glad you enjoyed it!
When I first started with the ILI9341, I used the Teensy and Adafruit libraries. It was not very impressive!
I then looked at their text/font writing code, and they were using pixel writes to do fonts! Awful, lazy coding. I wrote my own routines using block transfers for each character. I was able to obtain a 17x performance increase. The problem for me was the characters appeared upside down and backwards because of the way fonts were stored in flash. Once I re-ordered the fonts, everything was fine. I wonder if you have run into this (have 'they' re-written the text routines)?
Hi. Yes. Some of the library functions are a bit slow. Partly I think this is due to the way the serial screens work. From what I can remember the library routines allow you to write text over the top of other graphics without destroying them. This can only really be done sending individual 'on' pixels. Your block writes are a great idea and as you've shown greatly increase the write speed, but I guess you'll end up with a rectangular background area to each character. For text on a blank background this is great, but I guess the library needs to have the more general purpose case.
Extending libraries with your own optimisations will always be the way to go to get the performance you want. So I'd say think of the library as a starting point with the challenge to do it better yourself.
I have no need for motion graphics; I'm controlling machines via relays, stepper motors, etc. I do need different sized fonts for text. I had never thought about mixing text and graphics in the same space. I also did quite a bit of experimenting on SPI clock rates. It was years ago, but I think 400K was as high as I could reliably run with these ILI9341 LCDs. What has been your experience?
I strongly suspect the PIO alone can convert an RGB332 framebuffer up to RGB565 by just shifting out the MSBs and zero padding. It is fast enough in C, though. You can do per-scanline conversion into a buffer and transfer it asynchronously with DMA. It's near enough cost-neutral, since your pixels are being converted while the previous batch are transferred.
There's a PIO interface in MicroPython that might actually make PIO conversion possible without delving into the horrors of writing MicroPython C extensions (or writing C outright).
Hi. Thanks for the tips. You're right that the PIO route looks like a good bet.
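For anyone who wants to experiment, here is a rough, untested sketch of that PIO idea in MicroPython. It assumes each RGB332 pixel is fed to the state machine as a byte shifted to the top of the 32-bit FIFO word, and the expanded RGB565 value is read back from the RX FIFO; wiring the output straight into the SPI transmit path is left out.

import rp2

@rp2.asm_pio(out_shiftdir=rp2.PIO.SHIFT_LEFT, in_shiftdir=rp2.PIO.SHIFT_LEFT,
             autopull=True, pull_thresh=8, autopush=True, push_thresh=16)
def rgb332_to_rgb565():
    out(x, 3)         # RRR
    in_(x, 3)
    in_(null, 2)      # pad red to 5 bits
    out(x, 3)         # GGG
    in_(x, 3)
    in_(null, 3)      # pad green to 6 bits
    out(x, 2)         # BB
    in_(x, 2)
    in_(null, 3)      # pad blue to 5 bits; the 16-bit result is autopushed

sm = rp2.StateMachine(0, rgb332_to_rgb565)
sm.active(1)
sm.put(0xE3, 24)      # one RGB332 pixel, shifted to the top of the FIFO word
print(hex(sm.get()))  # expanded RGB565 value (0xe018 for this pixel)

Feeding the TX FIFO and draining the RX FIFO with DMA, or chaining the output into the SPI peripheral, is the part that still needs working out.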
great video much to learn from this ❤
Thanks
Good to know I'm not the only one who's tried a framebuffer approach to speed up display rendering. One comment/question, though. It's hard to tell from this video, but how bad is the screen tearing? I've written a super simple test in C that makes use of a framebuffer and DMA. It fills the entire screen with one color, waits a second, changes to another color, rinse and repeat. On the surface it works as designed, but if you look closely you can see the display "actively refreshing" -- and it's particularly apparent when the previous color and the next color contrast. Have you seen this as well? Is there a way to mitigate it? Or is it likely simply a result of the display having its own SRAM (as opposed to writing directly to VGA/DVI where the display is expecting constant updates and does not "latch" the display data)?
**EDIT** Please ignore this. For those who are having a similar issue described, check the CASET and RASET commands in the ST7735R datasheet. It appears my computed 32K of required memory (for 128x128@16bits) fills the screen. Thus, I was actually rendering twice in one pass.
Some rows of memory are rendered offscreen so adjustments using the aforementioned commands are necessary. Still investigating why it won't fill the whole screen in 32K (I had to bump it up to 40K to get it to fill). Much less tearing seen.
Good to hear you're getting the issues sorted. Strange that you're having to send extra data to get the screen to fill. Just a thought, but is the driver chip initialised to the correct resolution. I think there are register values in the initialisation that control this.
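For reference, on these controllers the active window is set with the column address (CASET, 0x2A) and row address (RASET, 0x2B) commands before the memory write (RAMWR, 0x2C). A rough sketch of what that looks like, assuming the driver exposes write_cmd()/write_data() style helpers (the names vary from driver to driver):

def set_window(disp, x0, y0, x1, y1):
    disp.write_cmd(0x2A)    # CASET: column range
    disp.write_data(bytearray([x0 >> 8, x0 & 0xFF, x1 >> 8, x1 & 0xFF]))
    disp.write_cmd(0x2B)    # RASET: row range
    disp.write_data(bytearray([y0 >> 8, y0 & 0xFF, y1 >> 8, y1 & 0xFF]))
    disp.write_cmd(0x2C)    # RAMWR: pixel data sent next lands in the window

Sending more pixels than fit in the window makes the controller wrap back to the start of it, which can produce the double-rendering effect described above.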
Did you try updating by slices or blocks, like an old CRT TV? For example, you could make an image block of 320x60 pixels for the first layer with the frame buffer method and then send it to the display to show it in the upper quarter of the screen. After that, make the second layer and send it to fill the second quarter of the screen. Do the same for the third and fourth. When you start again with the first quarter, you just need to overwrite the previous image block with the new one. You would need only 38 kB at a time.
That's something to try out. Thanks.
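A rough outline of that strip idea in MicroPython, assuming the driver has (or you add) a method, call it show_block(), that writes a buffer to a given window on the screen (for instance built on the set_window() sketch above). The sizes are just illustrative:

import framebuf

STRIP_H = 60                               # 320 x 60 x 2 bytes = 38400 bytes
strip = bytearray(320 * STRIP_H * 2)
fbuf = framebuf.FrameBuffer(strip, 320, STRIP_H, framebuf.RGB565)

for n in range(4):                         # four strips cover a 320x240 screen
    fbuf.fill(0)
    # draw whatever falls in rows n*STRIP_H to n*STRIP_H + 59 here
    display.show_block(strip, 0, n * STRIP_H, 320, STRIP_H)   # hypothetical method

The trade-off is that anything spanning two strips has to be drawn twice, once per strip, with the drawing coordinates offset accordingly.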
When you do the connection images with the Pico you should also point out that SCL is also called CLK or SCK, that SDA is MOSI, and so on (MISO, SDI, RX/TX, etc.), as some boards don't use the same labels as the driver. Even your image shows LCD CLK connected to GP10 and the driver doesn't have a CLK pin assignment, so if people are learning this will confuse them.
Thanks for the comments. Yes. I guess there are quite a few combinations of terms out there.
Amazing video! Maybe a stupid question, does this work on an Arduino too?
The Arduino doesn't support MicroPython so first you'll have to move to C++. The Arduino also lacks the memory space for a frame buffer so you'd need to draw directly to the screen. I did make a video a while back using an SPI screen with Arduino. ua-cam.com/video/Oh9vgomyuOI/v-deo.html
@BytesNBits thank you, I was wondering if it would be possible with additional RAM and a 4-bit lookup table for colors
Is it possible to run multiple ST7789 displays with this driver? I have some 16MB Pico clones with plenty of RAM for framebuffers and I need to run 8 displays. I have 3 running on a Teensy 4.1 but I've run out of memory.
I don't see why not. Each instance should be able to run separately but you'll need to make sure you keep control of when each tries to use the SPI channel.
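As a sketch only (the exact constructor arguments depend on the driver), sharing one SPI bus looks something like this, with each display getting its own chip select pin:

from machine import Pin, SPI
from st7789 import ST7789      # or however your driver module/class is named

spi = SPI(1, baudrate=31250000, sck=Pin(10), mosi=Pin(11))

# hypothetical constructor - match it to what your driver actually takes
disp1 = ST7789(spi, 240, 240, cs=Pin(17), dc=Pin(13), reset=Pin(12))
disp2 = ST7789(spi, 240, 240, cs=Pin(14), dc=Pin(15), reset=Pin(16))

disp1.show(buffer1)    # the driver must only assert its own CS while writing
disp2.show(buffer2)

SCK, MOSI and even DC/RESET can be shared; CS is the one line that has to be unique per display.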
Is it possible to render a full image at 15 frames per second with the ili9341 display? And if not, is there any sort of tutorial for programming the ili9341 screen with C or C++?
I think you've already answered your question. Micropython won't be able to process the full 320x240 pixels at that rate. You might be able to do it using a hybrid DMA and PIO driver but that's getting complex. Dropping into C++ is the way to go. Again you're still going to be at the limits of the Pi Pico but it's your best bet.
I've been concentrating on the Pico in Micropython so haven't got a tutorial. I did make one on the standard Arduino in C++ a while back - ua-cam.com/video/4DtuOeeYHys/v-deo.html.
Hi. Would you mind making a new 2-3 minute UA-cam video explaining what a framebuffer is, how it works, the benefits and why it is used - show the bouncing boxes, and the illustration of the Pico connected to the framebuffer and to the screen LCD or other). i.e. Just a sub-set video of this longer video on your channel (without the driver and python info). I would like to put a link to this new video on a Microcontroller forum I'm a member of, and I think it would be of value to the other members. Many thanks.
Thanks for the idea. Don't forget you can link directly to the part of the video where I go over the framebuffer. Just use the UA-cam share button and then tick the use timestamp option.
@@BytesNBits thanks. I didn't know we could do that. Learn something new everyday!
There seems to be a missing part of the st7789 driver file, which causes problems at run time:
Traceback (most recent call last):
File "", line 130, in
File "", line 75, in main
AttributeError: 'ST7789' object has no attribute 'fill'
Check your method call. I don't think the method is just fill. Should be fill_rect or something like that.
Great stuff. What's MicroPython like at handling ROM-based lookup tables? Thinking of a 4 or 8 bit palette.
Hi. I'm not sure if Micropython can do that in the same way as C etc. You don't have such low level control over where your variables are stored. There are DMA packages that might help. Otherwise you'd need to read the rom data using the file system and then do lookups from the variable in ram.
@@BytesNBits Hi. I've been looking at MicroPython's inline assembler and wondering if a simple loop would be quick enough: i.e. fetch a word from the frame buffer, align it, look up the value in an array, then squirt it out of the SPI?
A 4-bit palette would be small enough to be in RAM as an array
@@wktodd That does sound possible. I did look at a similar solution using a bytearray on the python side being filled by an assembler code block. Then sending this as a block write from python using 8 blocks to do the full screen to keep memory usage down. I've not tried it though. Your method does seem a slicker solution!
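For anyone wanting to try that without going all the way to assembler, MicroPython's @micropython.viper decorator compiles to native code and might be quick enough. An untested sketch of the lookup loop, assuming a 16-entry RGB565 palette and a 4-bit packed source buffer:

import micropython

@micropython.viper
def expand_4bpp(src: ptr8, dst: ptr16, pal: ptr16, npix: int):
    # two pixels per source byte: high nibble first, then low nibble
    for i in range(npix >> 1):
        b = int(src[i])
        dst[2 * i] = pal[b >> 4]
        dst[2 * i + 1] = pal[b & 0x0F]

src, dst and pal would be bytearrays (or array('H') buffers for the 16-bit ones) passed in per frame or per strip; whether it reaches a useful frame rate is something you'd have to measure.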
I've been playing with this and I noticed that the driver function "color565()" returns the color in BRG format. The problem is a big-endian/little-endian thing. If you change the color565 function like this it returns RGB like it should.
''' display is big-endian, pico is little-endian '''
def color565(r, g, b):
    val16 = (r & 0xf8) << 8 | (g & 0xfc) << 3 | (b & 0xf8) >> 3   # standard RGB565 packing
    # swap the bytes:
    return (val16 >> 8) | ((val16 & 0xff) << 8)
That's great. Thanks for the tip and the code!
I have a question: how would you reduce to 4 bits/px on the 1.14 display? For the fbuf variable I thought of using framebuf.GS4_HMSB, the 4-bit format, instead of .RGB565, but I'm not sure about that or what to do to the bytearray size calculation or other variables.
There are a range of ways to reduce the memory size but mostly they then involve some sort of translation of the frame buffer values to the RGB565 values. Micropython isn't fast enough for that. The framebuffer may have some inbuilt conversion but I haven't looked into that (if so it will be in C so should run at a good speed). All I can really say is have a play!
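One built-in option that might be worth a try (untested here): FrameBuffer.blit() accepts an optional palette argument that converts between formats in C, so you could draw in GS4_HMSB and expand strips to RGB565 just before sending them. A rough sketch, with the strip size and colours chosen arbitrarily:

import framebuf

# 16-colour palette: a 16x1 RGB565 framebuffer, one pixel per colour index
palette = framebuf.FrameBuffer(bytearray(16 * 2), 16, 1, framebuf.RGB565)
palette.pixel(0, 0, 0x0000)    # index 0 = black
palette.pixel(1, 0, 0xF800)    # index 1 = red, and so on

# 4 bit/px drawing surface: 240 x 240 / 2 = 28800 bytes
gs4 = framebuf.FrameBuffer(bytearray(240 * 240 // 2), 240, 240, framebuf.GS4_HMSB)

# expand one 240x40 strip at a time to RGB565 before sending it to the screen
strip = framebuf.FrameBuffer(bytearray(240 * 40 * 2), 240, 40, framebuf.RGB565)
for n in range(6):
    strip.blit(gs4, 0, -n * 40, -1, palette)   # format conversion happens in C
    # send the strip buffer to rows n*40 to n*40+39 of the display here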
@@BytesNBits thanks, I do need to keep playing around to learn more about this. I appreciate the response and your helpful videos throughout the years, cheers
OK, I've got the display: 240x240 IPS, based on the ST7789. Now I've got pin 1 = GND, pin 2 = VDC power 3.3 V, pin 6 = DC. I can imagine pin 5 = RESET, but how can I map pin 3 = SCL, pin 4 = SDA and pin 7 = BLK to your ILI-based sck, mosi, miso, cs=Pin(17)?
Hi. SCL on your LCD is the clock signal (SCK), SDA is MOSI. MISO is not used. BLK is probably the backlight connection. You'll need to look at the datasheet for your display.
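In MicroPython terms that mapping comes out something like this (the GP numbers are just examples; note these 240x240 modules usually have no CS pin, and BLK can simply be tied to 3.3V instead of a GPIO):

from machine import Pin, SPI

spi = SPI(1, baudrate=31250000, sck=Pin(10), mosi=Pin(11))   # SCL -> SCK, SDA -> MOSI
dc = Pin(13, Pin.OUT)     # DC, module pin 6
rst = Pin(12, Pin.OUT)    # RES, module pin 5
blk = Pin(14, Pin.OUT)
blk.value(1)              # BLK, module pin 7: drive high to turn the backlight on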
As an absolute beginner I can't really understand it but I'd like to commend you on your hard work. It's a shame you have so few subs.
Thanks. I try to keep it suitable for a range of levels but some projects are a bit more involved.
thanks for the helpful vids - question for you: when I use the color565() function with the display (not using framebuffer) the colors are correct. If I use the buffer and pass in the color565() function, the colors are different/incorrect.
any reason why that may be the case?
I ran your example code and removed the random color generator on the boxes and set a default value. Same issue -- solely changing the framebuffer changes the color of the box. Maybe it's to do with my screen: it's an Adafruit 3.5" TFT which uses the HX8357D chipset.
If anyone else has the same issue, the fix is that the color565 function needs to be updated to the below, since the framebuffer class uses LSB instead of MSB byte order:
def color565(r, g, b):
    val16 = (r & 0xf8) << 8 | (g & 0xfc) << 3 | (b & 0xf8) >> 3   # standard RGB565 packing
    # swap the bytes:
    return (val16 >> 8) | ((val16 & 0xff) << 8)
Hi. First off make sure you're using the right driver library for your chipset. If that's all OK it might be a colour order bug. The framebuffer might be decoding the 565 value and then re-encoding it when writing to the screen. I'd try drawing all the boxes in red - RGB(255,0,0) - and see what comes out of the buffer. If it's a pure green or blue then you know it's the byte order.
If the colour translation from 8 bit to 16 bit involved only bit moves, could the PIO do it with the state machine upon transmission?
I think so. I did have a look at that as a way around it but haven't tried coding it yet. I couldn't get my head around some bytes needing to be sent as is for the control codes etc. and some needing the bits shifted around for the colours.
@@BytesNBits got it, thanks. Love your channel
Very cool!
Thanks. Good to see you again.
I tried it on my ESP32, and it works fine, but it turns out that I don't have enough memory to allocate the whole buffer, so I rendered only part of my screen. Can I do something about it so I can render the whole screen, or is that clearly impossible because I don't have enough memory? Btw the video is great and I learned a lot.
Check if your ESP32 has some PSRAM attached. This is a fast SPI RAM that you can use. I'm pretty sure you can get Micropython to use it but I haven't tried it myself. Otherwise, yes, you're stuck :)
I'm using a Waveshare display that requires the ST7789 driver and I seem to have everything uploaded correctly. When I import the boxes buffer example I get this...
SPI(1, baudrate=31250000, polarity=1, phase=1, bits=8, sck=10, mosi=11, miso=8)
51.27%
32.67308
32.57108
...in the REPL, which tells me it's running and is reporting fps, but the screen is off, no backlight or anything. Is there a way to turn the screen on, or should it be turning on just from having the Pico connected to it and connected to power? (Which would mean my display is a dud?)
There should be a backlight input on the screen. Usually labelled LED or something like that. It usually needs to be connected to either 5V or 3.3V depending on your screen.
@@BytesNBits Thanks for taking the time to respond! It's one of those like in your video where all the Pico W pins plug into the back, and it does have an LCD backlight pin; I made sure to specify that in the code as well as all the others. I'm thinking it's a dud /:
I just noticed it was reporting the wrong MISO pin, but changing that still didn't work. When I did the import again I was still getting fps in the REPL, but after I do Ctrl-C to stop it, at the end it says display off, like below:
SPI(1, baudrate=31250000, polarity=1, phase=1, bits=8, sck=10, mosi=11, miso=12)
55.18%
45.31691
45.1198
45.09681
45.09504
display off
Does that mean the display is "turning off" from me stopping the code from running? Should it be saying display on at the beginning? I don't get what I'm missing here lol
OK, I mapped the 240x240 ST7789 with spi1_sck=10, spi1_mosi=11, st7789_res=12, and st7789_dc=13, and modified both screen_width/height to 240 (and the buffer width/height to 240), but now I get a MemoryError: memory allocation failed, allocating 115200 bytes
Hi. You'll need to reduce the active area on the screen as you've got too many pixels for the Pico to handle. Have a look at ua-cam.com/video/fGfb2NvDlG4/v-deo.html
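For reference, the numbers behind that error: 240 x 240 pixels x 2 bytes per RGB565 pixel = 115,200 bytes, which is a big chunk of the Pico's MicroPython heap and often fails to allocate as one contiguous block. Reducing the buffer height (and updating only that active area) shrinks it directly, for example:

width, height = 240, 120               # buffer covers half the screen
buf = bytearray(width * height * 2)    # 57,600 bytes instead of 115,200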
@@BytesNBits Thanks, i'll check.
Is it possible for the Pico to read a video file from a micro SD card and display it on a LCD without needing to be connected to a PC? I just want to be able to flip a switch to turn a standalone Pico on and have it start playing the video file automatically. Is this possible? Sorry for the newbie question.
I doubt if the Pico would manage a video file. To read the SD card, decode the images, and then send it to the screen would be a push for the system. It could probably be done but you'd be looking at a few seconds per frame speed. A slide show of still images is possible. I did an Arduino tutorial a while ago using bitmap images. ua-cam.com/video/exjzGMX4fqs/v-deo.html
@@BytesNBits What are your thoughts on pairing the Pico with the VGA Pico Demo Board? Seems to have video output, output audio jack and a micro SD slot.
Doesn't seem to be working in Thonny, I just get an error.
Hi. What error do you get?
Nice project! Now, how to create a PDA-like system with an SPI LCD + RP2040?
Over to you :)
Sir, I would like to build a handheld NES emulator with the Pi Pico and one of these little red displays... I can't find anyone who has done this. I have already built the emulator with the Pi Pico but getting it to work with the little display has got me stumped... any thoughts on the matter?
You need to look at the display output code in the emulator to see if it supports SPI screens. If not you'll need to code it. Usually the emulator will have some sort of frame buffer in memory which will be sent to some display output code. You need to have your code take control and send the frame to the screen as in the video. Good luck!