How to condition a 100mV to 10V signal to 5V for Digital Input use

I have an input signal that ranges between 0 and 10V DC. If the signal is lower than 100mV, I would like it to be read as a LOW by a 5V digital input. If the signal is 100mV or higher (ideally exactly 100mV, but the threshold could be a bit higher if needed, up to a max of 10V), I would like it to be read as a HIGH.

In other words, I am looking for something that will output a 5V signal when the input is anywhere between 100mV and 10V. The signal changes rarely and is otherwise static.

Is there a simple way to achieve this?

In case you want to know the application...

I have 2 LED light fixtures (one over my electronics workbench and the other pointed at the ceiling for diffused lighting), both controlled by Qubino 0-10V Z-Wave dimmers. The light fixtures become super dim at 0V but do not turn off completely. That forces the additional complication of putting these lights behind Z-Wave switches (I can't use LED lights that are directly dimmable by a Z-Wave dimmer, as the ones I found are way worse than these 'commercial' 4000K 5000-lumen ones I picked). The idea is to turn on a relay that powers the LED driver when the dimmer output is 100mV or higher. If it ever dips below that, the relay would turn off after a delay.

I was planning on prototyping this with a regular Arduino and then either moving to a bare chip like the one on the UNO or just using a Nano. I have a power source inside the light fixture, and there is plenty of space for additional electronics.

You need an op-amp configured as a comparator (or, better, a dedicated comparator IC). Put the reference voltage on the inverting (-) input and your 0 to 10V signal on the non-inverting (+) input. The output will switch as the 0 to 10V signal crosses the reference voltage.

Search for comparator.

I would use a 2:1 voltage divider and either the Arduino's built-in analog comparator, set to a 50 mV trip point, or read the divider output with an analog input and make the decision in software.