General Approach - Bitmath heavy?

Hello.

I have made a "binary clock" as a learning project of my own design.

It uses TPIC6C595N ICs that current-sink the LEDs of six 10-LED bar graphs. Not all of the LEDs are used!

The number of LEDs/bits per "bar" is as follows:
Month = 4
Day = 5
Hours = 5
Minutes = 6
Seconds = 6
Seconds/64 = 6

Giving a total of 32 bits (which I kinda designed for...seemed like a nice number at the time).
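
To make that concrete, here's a minimal sketch of how I picture packing the six fields into one 32-bit value (the bit order within the word is just my working choice, nothing final):

```cpp
// Pack the six time fields into one 32-bit word, using the bit
// widths listed above. Field placement is my own working choice.
uint32_t packTime(uint8_t month, uint8_t day, uint8_t hours,
                  uint8_t minutes, uint8_t seconds, uint8_t sec64) {
  uint32_t t = 0;
  t |= (uint32_t)(month   & 0x0F) << 28;  // 4 bits
  t |= (uint32_t)(day     & 0x1F) << 23;  // 5 bits
  t |= (uint32_t)(hours   & 0x1F) << 18;  // 5 bits
  t |= (uint32_t)(minutes & 0x3F) << 12;  // 6 bits
  t |= (uint32_t)(seconds & 0x3F) <<  6;  // 6 bits
  t |= (uint32_t)(sec64   & 0x3F);        // 6 bits
  return t;
}
```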

The idea is that I want to read the time from the DS1307 RTC every 10 seconds or so and then shift it out to the LED bar graphs.

For example, if the time were xx:xx:10 (hh:mm:ss), I'd want to shift "001010" out to the seconds LEDs.
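
For what it's worth, I'm planning to poll the RTC roughly like this (using Adafruit's RTClib; the seconds/64 value would have to come from somewhere else, e.g. the SQW pin or millis(), since the DS1307 doesn't expose sub-seconds):

```cpp
#include <Wire.h>
#include <RTClib.h>  // Adafruit RTClib

RTC_DS1307 rtc;

void setup() {
  Wire.begin();
  rtc.begin();
}

void loop() {
  static uint32_t lastPoll = 0;
  if (millis() - lastPoll >= 10000UL) {  // poll every ~10 seconds
    lastPoll = millis();
    DateTime now = rtc.now();
    // now.month(), now.day(), now.hour(), now.minute(), now.second()
    // give the fields to display; seconds/64 is derived separately.
  }
}
```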

The problem is that the LEDs are not wired "in order". For example, the first 4 bits that get shifted out light up the first 4 seconds/64 LEDs, then the next 4 bits shifted out land on the hours LEDs... it is all a bit "mixed up".

Essentially, I need to shift out 4 bytes in this order:

[Month3, Month4, Minute2, Minute1, Minute3, Minute4, Minute5, Minute6] = byte 1
[Day4, Day5, Month1, Month2, Seconds1, Seconds2, Seconds3, Seconds4] = byte 2
[Hour5, Day1, Day2, Day3, Seconds5, Seconds6, Seconds/64-1, Seconds/64-2] = byte 3
[Hour1, Hour2, Hour3, Hour4, Seconds/64-3, Seconds/64-4, Seconds/64-5, Seconds/64-6] = byte 4

So I thought I could use a LUT of 32-bit longs and some bit masking to remap the bits.
Is this a decent approach, or is there a better one?
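
Roughly what I'm imagining (the table contents are placeholders I'd fill in from the byte map above once the shift order is nailed down, and the pin names are whatever I end up wiring):

```cpp
#define DATA_PIN  2   // placeholder pin assignments
#define CLOCK_PIN 3
#define LATCH_PIN 4

// One 32-bit mask per logical bit: bitMask[i] is the physical output
// position that logical bit i of the packed time word should light.
// These entries are placeholders, to be filled in from the byte map.
const uint32_t bitMask[32] = { /* ... 32 entries from the wiring map ... */ };

// Scatter the logically ordered bits into their physical positions.
uint32_t remap(uint32_t logical) {
  uint32_t physical = 0;
  for (uint8_t i = 0; i < 32; i++) {
    if (logical & ((uint32_t)1 << i)) {
      physical |= bitMask[i];
    }
  }
  return physical;
}

// Shift the physical word out as 4 bytes, byte 1 first (here I'm
// treating byte 1 as the most significant byte of the word).
void shiftOutTime(uint32_t physical) {
  for (int8_t b = 3; b >= 0; b--) {
    shiftOut(DATA_PIN, CLOCK_PIN, MSBFIRST, (physical >> (8 * b)) & 0xFF);
  }
  digitalWrite(LATCH_PIN, HIGH);  // rising edge on RCK latches the outputs
  digitalWrite(LATCH_PIN, LOW);
}
```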

Thanks!