How to simulate a new clock speed using delay

I'm working on a project that uses an Uno or Mega as a PS2 controller. But I have a question about the delay needed to match the instruction rate that PS2 controllers use. I was wondering if this is the right calculation:
neededDelay = currentClockMicroseconds * wantedClockMicroseconds
              (16 MHz, in µs)            (250 kHz, in µs)

I'm not sure what you really want to do. Why would there be a special instructions-per-cycle requirement for the PS2? (See this project for example.)

To your maths: I'm not sure what an MHz or kHz "in microseconds" would be, since a frequency is the inverse of a time. But whatever you do, multiplying microseconds by microseconds (or frequency by frequency) does not give microseconds (or a frequency), so this can't be right either way.

If you run at 16 MHz (16 million ticks per second) and want to run at 2 MHz, you need to slow down by a factor of 16/2 = 8. So each tick of the new slow clock corresponds to 1 active tick and 7 ticks of waiting. I'm not sure how (or why) you would want to handle that at the instruction level in plain code, though... are you coding in assembly language?

Delta_G:
Hz is 1/s. MHz would be 1,000,000/s and kHz would be 1,000/s.

You cut my sentence.
My point is that a frequency is not a time; it's the inverse of a time.

Delta_G:
Hz is 1/s. MHz would be 1,000,000/s and kHz would be 1,000/s.

We used to use cps (cycles per second) instead of Hertz. It was clearer, but not nearly as cool.

J-M-L:
I'm not sure what you really want to do. Why would there be a special instructions-per-cycle requirement for the PS2? (See this project for example.)

To your maths: I'm not sure what an MHz or kHz "in microseconds" would be, since a frequency is the inverse of a time. But whatever you do, multiplying microseconds by microseconds (or frequency by frequency) does not give microseconds (or a frequency), so this can't be right either way.

If you run at 16 MHz (16 million ticks per second) and want to run at 2 MHz, you need to slow down by a factor of 16/2 = 8. So each tick of the new slow clock corresponds to 1 active tick and 7 ticks of waiting. I'm not sure how (or why) you would want to handle that at the instruction level in plain code, though... are you coding in assembly language?

OK, let me be more clear. I found the time for one instruction at both frequencies (0.0625 µs at 16 MHz and 4 µs at 250 kHz). My question is: since I know the clock cycle time for each, can I multiply 0.0625 µs × 4 µs to get the delay needed to simulate the frequency?

SpaceGod:
to get the delay needed to simulate the frequency?

What do you propose to do with this "delay"? I just can't figure out what you have in mind.

...R

Then what is the best way to do this? I'm trying to do it in code and don't want to mess with f_cpu.

Delta_G:
Who knows. We're still asking you to clarify what you want.

OK
I am a Computer Science undergrad, and I'm trying to use an Uno as a digital arcade stick for my PS2 (they are really hard to find). I know that the Uno clock speed is 16 MHz, but the PSX controller frequency is 250 kHz. As you can see, 16 MHz is way too fast. So I came here to see how I would go about slowing the Uno down using code; I heard delay is the only way to do this. I know my math is off, so I made this thread to get in contact with someone who can explain it to me.

Also, thank you for letting me know I was wrong, and sorry for being vague (I have a problem with explaining things).

SpaceGod:
I know that the Uno clock speed is 16 MHz, but the PSX controller frequency is 250 kHz. As you can see, 16 MHz is way too fast. So I came here to see how I would go about slowing the Uno down using code; I heard delay is the only way to do this.

I suspect you have been badly informed.

You need to tell us what data (or signal) the Arduino needs to produce to emulate the PSX controller. There is a lot more to it than 250 kHz. And it is vanishingly improbable that delayMicroseconds() will be the solution.

...R

I know, but shouldn't it just be a basic SPI slave? That's what it seemed like when I started doing some research online, and I was thinking all you have to do is slow the Arduino down and code it as such. I'll link all my resources if needed.

https://www.gamesx.com/controldata/psxcont/psxcont.htm

SpaceGod:
I know, but shouldn't it just be a basic SPI slave? That's what it seemed like when I started doing some research online, and I was thinking all you have to do is slow the Arduino down and code it as such. I'll link all my resources if needed.

https://www.gamesx.com/controldata/psxcont/psxcont.htm

There's nothing on that page that mentions 250 kHz. There's a reference to an 8 MHz crystal for a microcontroller to emulate a controller.

But ultimately what matters is not the clock speed of the microcontroller but the speed (and stuff like framing) of the communication protocol between the microcontroller (which emulates a PSX controller) and the PSX. The PSX doesn't care what clock speed the microcontroller in the game controller runs at.

christop:
There's nothing on that page that mentions 250 kHz. There's a reference to an 8 MHz crystal for a microcontroller to emulate a controller.

But ultimately what matters is not the clock speed of the microcontroller but the speed (and stuff like framing) of the communication protocol between the microcontroller (which emulates a PSX controller) and the PSX. The PSX doesn't care what clock speed the microcontroller in the game controller runs at.

Thanks for the insight, but I thought that's the reason they have a clock line on the PS2 controller. Let me know if I'm wrong, but isn't it supposed to keep things in sync? And if my microcontroller is running at a higher frequency than needed, won't it drop inputs on certain cycles, kind of like frame skipping?

http://ece545.com/S15/reports/F13_PSX.pdf

OK, I understand now: the pin's frequency has nothing to do with the internal clock. I did a little more reading, and I think what I'm talking about is the PWM frequency on the digital lines?

That's an interesting document, and it certainly has everything you'd need to devise and write Arduino software to run the controller. But it wouldn't be a trivial program by any means. They give an example with an MC68HC11, an old Motorola microcontroller descended from the 6800 line. That processor is not very similar to the AVR microcontroller used in an Arduino, so you'd need to translate a lot of code.

Slowing the Arduino processor down ... that would not work.