I understand the problem of bigger battery capacities, and therefore it wouldn’t be fair to judge the manufacturer by a percentage of charge per time, because a lower percentage value could still mean there is more energy in the battery.
So in general your test doesn’t make much sense, because whether my battery is charged in 1 hour or 3 hours is irrelevant as long as I don’t know how much runtime I get per hour of charging.
But your unit mA/m (milliamps per minute) confuses me more than it helps. What is mA/m supposed to mean? An ampere is already charge per second, so we would be dividing by time twice?
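Just to spell the unit algebra out (a plain dimensional check, nothing device-specific):

```latex
\frac{\mathrm{mA}}{\mathrm{min}}
  = \frac{10^{-3}\,\mathrm{A}}{60\,\mathrm{s}}
  = \frac{10^{-3}\,\mathrm{C/s}}{60\,\mathrm{s}}
  \approx 1.7 \times 10^{-5}\ \frac{\mathrm{C}}{\mathrm{s}^{2}}
```

Charge per time squared is a rate of change of current, not a charging speed. If you meant how much charge goes into the battery per minute, that would be mAh per minute, which is dimensionally just a current again (1 mAh/min = 60 mA).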
I think you want a value for how much energy a charging adapter can deliver to the battery per unit of time. But that wouldn’t say much on its own, because over a short window a lot of energy can always be transferred, especially in the first 15 minutes while the battery is still cold; you have to take the full charging cycle into account (see the sketch below). Apart from that, the adapter’s output energy doesn’t all end up stored as chemical energy; some of it dissipates as heat from the battery.
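To make that concrete, here is a small sketch with a completely made-up charging curve (constant current at first, then an exponential taper, roughly like the CC/CV behaviour of lithium cells; none of the numbers are measured from any real device):

```python
import math

# Hypothetical charging current: 2 A for the first 30 minutes,
# then an exponential taper. Invented numbers, for illustration only.
def current_amps(t_min: float) -> float:
    if t_min < 30:
        return 2.0
    return 2.0 * math.exp(-(t_min - 30) / 40)

# Charge delivered between start and end (minutes), in mAh,
# via simple rectangle-rule integration of the current curve.
def charge_mah(start_min: float, end_min: float, step: float = 0.1) -> float:
    total_ah = 0.0
    t = start_min
    while t < end_min:
        total_ah += current_amps(t) * step / 60  # amps * hours for this step
        t += step
    return total_ah * 1000

first_15 = charge_mah(0, 15)
full_cycle = charge_mah(0, 180)
print(f"first 15 min: {first_15:.0f} mAh ({first_15 / 15:.1f} mAh/min)")
print(f"full 3 h cycle: {full_cycle:.0f} mAh ({full_cycle / 180:.1f} mAh/min)")
```

With this curve the first 15 minutes look more than twice as fast per minute as the cycle average, so judging a charger by its first quarter hour flatters it.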
It’s difficult to measure how much energy has flowed into (and actually been stored in) the battery.
But if you want to try, you should be aware that measuring the current (amperes) alone isn’t sufficient. The energy is made up of the current, the voltage at which that current flows, and the time.
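If you do want to measure it, a rough sketch: log voltage and current periodically while charging (e.g. with a USB power meter) and integrate power over time. The sample readings below are invented:

```python
# Energy is the integral of voltage * current over time.
# Each tuple is (time in seconds, volts, amps): hypothetical
# readings such as a USB power meter might log during charging.
samples = [
    (0,   5.10, 2.00),
    (60,  5.08, 1.95),
    (120, 5.05, 1.90),
    (180, 5.02, 1.85),
]

energy_j = 0.0
for (t0, v, i), (t1, _, _) in zip(samples, samples[1:]):
    energy_j += v * i * (t1 - t0)  # power * dt for this interval

print(f"{energy_j:.0f} J = {energy_j / 3600:.3f} Wh delivered")
# This is the energy at the adapter's output; what gets stored
# chemically in the battery is less, since part is lost as heat.
```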