It's best to start with a video showing it at work:
Quite a while ago I posted another project showing 3 uECG devices working in EMG mode and controlling a robotic hand (https://www.hackster.io/aka3d6/robotic-hand-control-using-emg-349254). While it really did work and it was possible to replicate, the technology was, to say the least, far from perfect, and it wasn't practically usable due to a set of problems (especially in the radio connection). Someone could have made it work with a lot of effort, but there are already too many half-ready projects out there. :-)
We wanted to get back to it on a whole new level, with a dedicated EMG device - and got quite far along that way, but we still aren't there. So finally we decided to rework the uECG device's firmware - and succeeded! Right now we are improving the PC-side software to make gesture/muscle activity easy to use in other projects: the current version works quite well already, but it is not friendly to modifications due to a lot of unused and poorly commented code; we'll fix that in the coming weeks as well. So if you wanted to tackle something related to muscles - either from a visual perspective, or by using them to control something - now you have a tool for that. In its simplest form, it can be used just to add some visual effects:
But if you look closer at the charts in the top right corner, you will see the spectrum and the processed EMG levels. All of this data is fully available (after all, the whole project is open source / open hardware) - it is quite ready for processing with ML methods, and even simple threshold-based detection would give decent results.
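To give an idea of what "simple threshold-based detection" means here, below is a minimal sketch in Python. It assumes you already receive the processed per-channel EMG activity levels as a stream of tuples (the data format, channel layout, and threshold values are illustrative assumptions, not the actual uECG PC software API). A small hysteresis gap between the "on" and "off" thresholds keeps the output from flickering when a level hovers near a single cutoff.

```python
# Hypothetical sketch: hysteresis thresholding over processed EMG levels.
# Channel count, value range, and thresholds are assumptions for
# illustration - adapt them to the actual data the base station provides.

def detect_activations(levels, on_threshold=0.6, off_threshold=0.4):
    """Return (sample_index, channel, event) tuples.

    A channel switches 'on' when its level rises to on_threshold and
    switches 'off' only once it falls back to off_threshold, so brief
    wobbles around one cutoff don't produce spurious events.
    """
    active = [False] * len(levels[0])
    events = []
    for t, sample in enumerate(levels):
        for ch, value in enumerate(sample):
            if not active[ch] and value >= on_threshold:
                active[ch] = True
                events.append((t, ch, "on"))
            elif active[ch] and value <= off_threshold:
                active[ch] = False
                events.append((t, ch, "off"))
    return events

# Toy run: a single channel ramping up and back down
stream = [(0.1,), (0.5,), (0.7,), (0.8,), (0.5,), (0.3,)]
print(detect_activations(stream))  # [(2, 0, 'on'), (5, 0, 'off')]
```

Per-channel events like these map naturally onto "finger muscle group became active / relaxed", which is enough for the kind of visual effects and simple control shown in the video.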
To make an EMG-based project, you will need several uECG units (starting from 2 devices you can distinguish muscle groups responsible for different fingers; 4 units would give quite precise information) and one PC base station (the thing with USB). The base station is also used for wireless firmware updates - and there will be a lot of new versions in the coming months as we catch bugs and add functions.
Placing the units
Unit placement is not a simple question - as the image above illustrates, some muscles are right under the skin (indicated in blue), while others lie deeper, beneath other muscle layers (the green ones). Their positions, while generally similar for all humans, aren't precisely the same on every arm (and their skin projections shift as the arm moves!) - so the more precision you want, the more important and user-dependent placement becomes.
Still, for major finger movements even a rough estimate works - in all my attempts I used the old trusty "attach it where it looks right" approach without any measurements, and over different sessions I placed units in quite different spots. So I guess for low- and moderate-precision setups it shouldn't be a problem - just make sure the target muscle is close to the skin in a certain area and place the device somewhere around it. :-)
Processing the data
This part is under active development right now. While the data looks very promising, we want to create an API that allows simple processing (I know from my own experience that modifying even properly commented code is hard, and our code is far from well commented) - and it's not ready yet. If you want some particular functionality, your comments would be greatly appreciated. We plan to release the first version within a couple of weeks, and when it's done, discuss in detail what can be done with it in a more technical and less demo-oriented post (as soon as it's ready, a link will appear in this place). :-)