CYD and ESP32 Camera

I have a Cheap Yellow Display (CYD) using LVGL and an ESP32-CAM. I've been searching the internet for an example of displaying the ESP32-CAM feed on the CYD screen. I have the CYD in the living area and the camera at the front door, and I want to be able to see the person at the door on the CYD. Does anyone have any sample code, or know of a site that has an example?

I have a few ESP32s controlling 240 V appliances around the house, using WiFi with the CYD and the ESP32s as server and clients. They work quite well. This is the first camera I have bought and I was looking forward to playing with it, but from what I've seen on the net so far it seems not many people do much with them. I could be mistaken, though.

Any help would be appreciated :grin: and Thank You in advance.

I have no idea what a CYD is, all my camera output goes to a web page.

Please post a link to the product page for this item.

This one ESP32s Cam

Cheap Yellow Display (CYD)

Why do you think that has the ESP32S processor?

There are plenty of examples for sending pics via WiFi from the standard ESP32-CAM, which is what that appears to be.

Nice, but I can't seem to buy any. Something to do with my address.

How do you get your cameras onto a web page? I have another 5 cameras on the way and I'm getting frustrated.

Do you mean how can you see video and pictures on a web page connected to a camera? The ESP32-CAM has that app as an example in the hardware section of the IDE examples; see the pic. As for your OP: put that code on an ESP32-CAM, and connect the CYD to another ESP32 running the example webserver code. It will need some minor changes to use the right IP, which is probably best done via static IPs set up on your router.
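On the static-IP point, here is a minimal sketch of what that looks like on the CYD side. The addresses are examples only (192.168.1.50 for the board, 192.168.1.1 for the router); adjust them to your own network:

```cpp
#include <WiFi.h>

// Example addresses only - match these to your own network
IPAddress localIP(192, 168, 1, 50);   // fixed address for this board
IPAddress gateway(192, 168, 1, 1);    // your router
IPAddress subnet(255, 255, 255, 0);

const char* ssid = "...........";
const char* password = ".........";

void setup() {
  Serial.begin(115200);
  // Call config() before begin() so DHCP is never used
  if (!WiFi.config(localIP, gateway, subnet)) {
    Serial.println("Static IP config failed");
  }
  WiFi.begin(ssid, password);
  while (WiFi.status() != WL_CONNECTED) delay(500);
  Serial.println(WiFi.localIP());
}

void loop() {}
```

Reserving the addresses in your router's DHCP table works just as well, and avoids hard-coding them in every sketch.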

I asked myself the same question today.
I don't have a solution, but I have an approach that might work.

In the GitHub documentation for the CYD there is an example called "ESP32-TV".
The author uses the ffmpeg library to convert the video files on the SD card for display on the CYD.

My idea is that it should be possible to adapt that example and use the ffplay tool that ships with ffmpeg (there is a section on streaming in the documentation).
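For anyone trying that route, the kind of ffmpeg conversion the ESP32-TV example relies on looks roughly like this. File names, resolution and frame rate are placeholders; check the ESP32-TV README for the exact format it expects:

```shell
# Shrink a clip to something a CYD-class display can handle:
# 320x240, low frame rate, MJPEG so each frame is an independent JPEG
ffmpeg -i input.mp4 -vf "scale=320:240,fps=10" -c:v mjpeg -q:v 10 output.avi

# ffplay (ships with ffmpeg) can preview a network stream on a PC first:
ffplay http://CAMERA_IP/stream
```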

But that's just a first thought - see above: I'm also looking.

Here's an example of my live camera feed test. The CYD is very limited by its buffer rate and crashes with MJPEG over WiFi, so major reductions in the stream are needed.

Hope this helps...

//Seans esp32 CYD live esp32-cam display tests

#include <WiFi.h>
#include <HTTPClient.h>
#include <Adafruit_GFX.h>
#include <Adafruit_ILI9341.h>
#include <JPEGDecoder.h> 

// ESP32-2432S028 (Cheap Yellow Display) pin config
#define TFT_CS   15
#define TFT_DC    2
#define TFT_MOSI 13
#define TFT_SCLK 14
#define TFT_RST  -1
#define TFT_MISO 12
#define TFT_BL   21

// Camera stream URL (replace with your ESP32-CAM IP)
const char* camSnapshotURL = "http://insert ip address from esp32 cam here/capture";

// Wi-Fi credentials
const char* ssid = "...........";
const char* password = ".........";

Adafruit_ILI9341 tft = Adafruit_ILI9341(TFT_CS, TFT_DC, TFT_MOSI, TFT_SCLK, TFT_RST, TFT_MISO);

void setup() {
  Serial.begin(115200);
  delay(100);
  Serial.println("Connecting to WiFi...");

  pinMode(TFT_BL, OUTPUT);
  digitalWrite(TFT_BL, HIGH);

  WiFi.begin(ssid, password);
  while (WiFi.status() != WL_CONNECTED) {
    delay(500);
    Serial.print(".");
  }
  Serial.println("\nWiFi connected!");

  tft.begin();
  tft.setRotation(1);
  tft.fillScreen(ILI9341_BLACK);
  tft.setTextColor(ILI9341_WHITE);
  tft.setTextSize(2);
  tft.setCursor(10, 10);
  tft.println("Requesting snapshot...");
}

void loop() {
  showCameraFrame();
  delay(150); // Reduced delay to improve frame rate
}

void showCameraFrame() {
  HTTPClient http;
  http.begin(camSnapshotURL);
  int httpCode = http.GET();

  if (httpCode == HTTP_CODE_OK) {
    WiFiClient* stream = http.getStreamPtr();
    int len = http.getSize();

    // Require a known Content-Length so we can size the buffer safely
    if (len <= 0) {
      Serial.println("Unknown content length!");
      http.end();
      return;
    }

    uint8_t* jpegData = (uint8_t*)malloc(len);
    if (!jpegData) {
      Serial.println("Failed to allocate memory!");
      http.end();
      return;
    }

    // Read the JPEG directly into the buffer, never past its end
    int index = 0;
    while (http.connected() && index < len) {
      size_t avail = stream->available();
      if (avail) {
        int c = stream->readBytes(jpegData + index, min((int)avail, len - index));
        index += c;
      }
      delay(1);
    }

    Serial.println("Decoding JPEG...");
    if (JpegDec.decodeArray(jpegData, index)) {
      renderJPEG();
    } else {
      Serial.println("JPEG decode failed!");
    }

    free(jpegData);
  } else {
    Serial.printf("HTTP error: %d\n", httpCode);
  }
  http.end();
}

void renderJPEG() {
  uint16_t *pImg;
  uint16_t mcu_w = JpegDec.MCUWidth;
  uint16_t mcu_h = JpegDec.MCUHeight;
  uint16_t max_x = tft.width();
  uint16_t max_y = tft.height();

  int32_t mcu_x = 0;
  int32_t mcu_y = 0;

  while (JpegDec.read()) {
    mcu_x = JpegDec.MCUx * mcu_w;
    mcu_y = JpegDec.MCUy * mcu_h;

    if ((mcu_x + mcu_w) <= max_x && (mcu_y + mcu_h) <= max_y) {
      pImg = JpegDec.pImage;
      if (pImg) {
        tft.drawRGBBitmap(mcu_x, mcu_y, pImg, mcu_w, mcu_h);
      }
    }
    yield();
  }
}
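One way to get those stream reductions is on the camera side rather than the display side. A hedged sketch using the standard esp_camera sensor API (this would go in the ESP32-CAM sketch, after esp_camera_init() has succeeded; the function name is mine):

```cpp
#include "esp_camera.h"

// Assumes the camera has already been initialised with esp_camera_init()
void reduceStreamLoad() {
  sensor_t* s = esp_camera_sensor_get();
  if (!s) return;
  // QVGA (320x240) matches the CYD panel, so no bandwidth is wasted
  s->set_framesize(s, FRAMESIZE_QVGA);
  // Higher number = more compression = smaller JPEGs (range 0-63)
  s->set_quality(s, 15);
}
```

Smaller frames mean less data for the CYD to buffer per snapshot, which should help with the crashes mentioned above.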

This topic was automatically closed 180 days after the last reply. New replies are no longer allowed.