
Topic: LCD + SD on SPI

_frank_

Thank you, SurferTim.
Can I read out the current SPI modes?

So I can read them, set my own, and reset them to the old values after doing my LCD stuff (making it more flexible)?

SurferTim

#16
Sep 04, 2013, 02:03 pm
When you get comfortable with the operation of different modes and bit orders, you can get fancy. I recommend keeping it simple until then.

You can use a variable to keep track of the settings, and change the mode and bit order if it doesn't match what you need for that device.

BTW, those functions that change the mode and bit order are really fast. You are not losing much time calling them like I suggested.
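
Something like this is what I mean, just as a rough sketch. The tracking variables and the spiApplySettings() helper are made up here for illustration; they are not part of the SPI library.
Code:
#include <SPI.h>

// Made-up tracking variables: remember what we last set.
uint8_t currentMode = SPI_MODE0;
uint8_t currentBitOrder = MSBFIRST;

// Only change the mode/bit order if the requested settings differ
// from what we last applied.
void spiApplySettings(uint8_t mode, uint8_t bitOrder) {
  if (mode != currentMode) {
    SPI.setDataMode(mode);
    currentMode = mode;
  }
  if (bitOrder != currentBitOrder) {
    SPI.setBitOrder(bitOrder);
    currentBitOrder = bitOrder;
  }
}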

_frank_

_How_ can I read the SPI mode values?

SurferTim

There are no read functions for the SPI settings. You would need to check the SPI SPCR register. Here are the write functions if that will help you.
Code:
void SPIClass::setBitOrder(uint8_t bitOrder)
{
  if(bitOrder == LSBFIRST) {
    SPCR |= _BV(DORD);
  } else {
    SPCR &= ~(_BV(DORD));
  }
}

void SPIClass::setDataMode(uint8_t mode)
{
  SPCR = (SPCR & ~SPI_MODE_MASK) | mode;
}
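
If it is of any use, here is a rough sketch of reading those settings back out of SPCR. The getter names are my own; the SPI library itself does not provide them.
Code:
#include <SPI.h>

// Returns the current data mode (SPI_MODE0..SPI_MODE3) from SPCR.
uint8_t spiGetDataMode() {
  return SPCR & SPI_MODE_MASK;
}

// Returns the current bit order (LSBFIRST or MSBFIRST) from SPCR.
uint8_t spiGetBitOrder() {
  return (SPCR & _BV(DORD)) ? LSBFIRST : MSBFIRST;
}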


_frank_

I tried this with the initialization and got both running (but with strange chars on the display).

Maybe the speed differs?

SurferTim

#20
Sep 04, 2013, 07:20 pm
This may be part of the strange characters.
Code:
  char *dt=GetDateTime();

This returns a pointer to a character array? Where is that character array declared? If not globally, it is out of scope.
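
I don't know how your GetDateTime() is written, but as a rough sketch, making the buffer static is the usual fix:
Code:
// Sketch only: the real GetDateTime() may look different.
char *GetDateTime() {
  static char buf[20];   // static, so the pointer is still valid after returning
                         // (a plain local array here would go out of scope)
  // ... fill buf with the formatted date/time ...
  return buf;
}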

edit: If you must, you can change the clock along with the mode and bit order. The LCD may be limited to about 1 MHz clock speed.
Code:
   // change to LCD settings
   SPI.setDataMode(SPI_MODE3);
   SPI.setBitOrder(LSBFIRST);
   SPI.setClockDivider(SPI_CLOCK_DIV16);

   // do your LCD stuff

   // change back to default (SD, w5100 etc) settings
   SPI.setDataMode(SPI_MODE0);
   SPI.setBitOrder(MSBFIRST);
   SPI.setClockDivider(SPI_CLOCK_DIV4);


_frank_

I acknowledge that the code with GetDateTime is dirty, but it works... The problem was the speed... I turned the speed for the display down (SPI.setClockDivider(SPI_CLOCK_DIV64); in SPI_init_lcd()) and it works.

Currently the SD is written at the same speed (it should be SPI_CLOCK_DIV2, I think), but both are working...

I am looking for an easy and clear way to format the date/time (sprintf would probably be best, but it has a lot of overhead).
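
Something like this is what I have in mind (only a sketch; the day/month/year/hour/minute parameters stand in for whatever the RTC really returns):
Code:
#include <stdio.h>

// Sketch only: formats "dd.mm.yyyy hh:mm" into a static buffer.
char *FormatDateTime(int day, int month, int year, int hour, int minute) {
  static char buf[20];                 // static, so the returned pointer stays valid
  snprintf(buf, sizeof(buf), "%02d.%02d.%04d %02d:%02d",
           day, month, year, hour, minute);
  return buf;
}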

SurferTim

#22
Sep 05, 2013, 11:51 am
I'm not sure what you meant, but you should set the speed back to the default in SPI_init_sd(), or the SPI speed will stay at 250 kHz for the SD card as well.
Code:
   SPI.setClockDivider(SPI_CLOCK_DIV4);

edit: The SD will work at 250 kHz, just 16 times slower than it does normally. Any time I can get that kind of speed increase, I take it!
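
For example, something along these lines (only a sketch; I am guessing at what your SPI_init_lcd() and SPI_init_sd() contain):
Code:
#include <SPI.h>

// Slow settings for the LCD (about 250 kHz on a 16 MHz board).
// Mode and bit order as in the earlier example; adjust for your LCD.
void SPI_init_lcd() {
  SPI.setDataMode(SPI_MODE3);
  SPI.setBitOrder(LSBFIRST);
  SPI.setClockDivider(SPI_CLOCK_DIV64);
}

// Back to the defaults for the SD card and w5100 (4 MHz).
void SPI_init_sd() {
  SPI.setDataMode(SPI_MODE0);
  SPI.setBitOrder(MSBFIRST);
  SPI.setClockDivider(SPI_CLOCK_DIV4);
}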

PaulS

Quote
I acknowledge that the code with GetDateTime is dirty

Then you should fix it first. Fixing the obvious stuff first makes more sense than tracking down elusive issues that might just go away once the obvious (to us, at least) stuff is demonstrated to be correct(ed).
The art of getting good answers lies in asking good questions.

SurferTim

Quote
I turned the speed for the display down (SPI.setClockDivider(SPI_CLOCK_DIV64); in SPI_init_lcd()) and it works

Currently the SD is written at the same speed (it should be SPI_CLOCK_DIV2, I think), but both are working...

@majenko: That concludes the test of the SD and other SPI devices working together. Just cold, hard facts.

majenko


Quote
I turned the speed for the display down (SPI.setClockDivider(SPI_CLOCK_DIV64); in SPI_init_lcd()) and it works

Currently the SD is written at the same speed (it should be SPI_CLOCK_DIV2, I think), but both are working...

@majenko: That concludes the test of the SD and other SPI devices working together. Just cold, hard facts.


Like I said - the Arduino is too slow for it to matter too much - plus, oh look - you slowed it down even more.  And not all cards suffer from it.  And you have a certain amount of separation in the level shifting circuitry.

So yes, I would expect it to work together.  If you'd actually read what I'd written before instead of just going "Nope, I don't agree." to everything blindly, then you'd know that it should work given the circumstances of this setup.

SurferTim

#26
Sep 05, 2013, 08:53 pm
Quote
Like I said - the Arduino is too slow for it to matter too much - plus, oh look - you slowed it down even more.  

Nope, not mine. I run it at full speed. The LCD will not work at full speed; it is limited to about 1 MHz according to the datasheet. You know about those, right?

edit: I still do not agree with anything you said, except that the 3.3V devices do not need a logic level converter.

majenko


Quote
Like I said - the Arduino is too slow for it to matter too much - plus, oh look - you slowed it down even more.

Nope, not mine. I run it at full speed. The LCD will not work at full speed; it is limited to about 1 MHz according to the datasheet. You know about those, right?

edit: I still do not agree with anything you said, except that the 3.3V devices do not need a logic level converter.


You know, I just can't be bothered with jerks like you any more.  Where's the ignore button?
