Try this schematic. Parts needed: any small NPN transistor, a resistor anywhere from 100K to about 470K (220K is a good choice), a capacitor from 1uF to 47uF (a polarized capacitor rated at least 10V is OK), and a 4.7K resistor (1/4 or 1/8 watt is fine for both resistors).
Connect one side of the 220K resistor to the collector of the NPN transistor.
Connect one side of the 4.7K resistor to +5 V.
Connect the other side of the 220K resistor to the base of the NPN transistor and to the + lead of the capacitor.
Connect the other side of the 4.7K resistor to the collector of the NPN transistor and to the Arduino analog input pin.
Connect the negative lead of the capacitor to the guitar output. Connect the emitter of the transistor, and the shield of the guitar connector/cable, to the Arduino analog circuit ground.
This circuit should give you an amplified, clipped (and usefully distorted) signal at an Arduino analog input pin, which is nice: you can just count the time in milliseconds or microseconds between sequential peaks (maximal A/D readings) of the guitar signal after A/D conversion. That time is the period (1/freq) of the dominant vibration of one or more strings; use that frequency to generate MIDI tones. In code, analyze the A/D results and discard pulses that are too short (harmonics) or seemingly random in width (the attack from plucking the string). Use the time between consistent, longer-duration peaks: that is the period of the tone played.
Start a timer at the detection of the first peak near +5V and stop it at the detection of the next maximal peak. Take 5 or more such samples. Discard readings that are too short or too long for the actual range of guitar tones. Then compare the remaining readings and keep the ones that make up the majority of the samples, i.e. those that match each other reasonably closely: their average is, statistically, the true period of the tone played. Only a few A/D readings should quickly give you the tone period info you need.