Heart rate measurement accuracy #532

Closed
Riksu9000 opened this issue Jul 28, 2021 · 44 comments
Labels
enhancement Enhancement to an existing app/feature

Comments

@Riksu9000
Contributor

Riksu9000 commented Jul 28, 2021

I have done some experiments with the heart rate sensor. I think one of the issues is that small movements while measuring can produce noise that could throw off the measurement. I don't think there's any code that detects noisy data and ignores it. The accelerometer could also be used to ignore data while there is movement. For some reason decreasing the gain improves accuracy for people. Why is that?

Setting the gain from 64 to 8 may have improved the results a bit. It worked at least as well as before.

Setting the LED current to 40mA didn't seem to make a difference.

The ambient light sensor could be used to detect if the watch isn't on the wrist, or if it's too loose, which could cause incorrect measurements as well.

Below you can see the SPL value graphed. This was with the gain set to eight. The blinking light I added might actually be more accurate than the number shown, implying there might be an issue with the conversion to BPM. EDIT: Maybe it's just HRV?

The data looks clean enough so it should be possible to get accurate measurements.

Here's the branch if anyone wants to test this.
https://github.com/Riksu9000/InfiniTime/tree/heartrate_test

VID_20210728_114837.mp4
@Riksu9000
Contributor Author

I just read that the sensor actually gives better results with less pressure. I think everyone's first instinct is to tighten the strap to get better contact, myself included. Perhaps this is why people are having issues with it, and if so, we should make it known to everyone. Maybe even add instructions to the app.

@JF002
Collaborator

JF002 commented Jul 28, 2021

The algorithm for the HR measurement comes from [wasp-os](https://github.com/daniel-thompson/wasp-os/blob/master/wasp/ppg.py). I ported it from Python to C++. When I talked to Daniel about it, he told me that it was quite simple and should/could probably be much improved :)

I'm certainly not a specialist in signal processing, so I'm glad someone takes the time to have a look at this!
And the graph with the SPL values on the background of the HR app looks really good!!

@hatmajster
Contributor

Yes, very nice Riksu. I don't know how you guys do it here, but I would personally even merge this graph, it's really nice for debugging the sensor's behavior :) Though maybe it's the feature creep in me ;)
I've combined it with my own PR to change the gain via options (#531) and was testing it a bit.
You say that 64x -> 8x may have improved things a bit. Well, to me it fixes the problem - I don't expect the PineTime's sensor to be really accurate and there's probably much room for improvement - but the current setting of x64 is just RNG :/ See for yourself (it's a bit badly cropped, but it should be visible, sorry). In the video I show how gains 8x, 4x and 64x work for me:

hrs.mp4

The left one is a TomTom Sports Touch, some cheap oldish wristband. As you can see, it probably has some filtering applied to the BPM, because it's too smooth. Or, I don't know, maybe it just does things better, incrementally somehow.
I don't know what the reason is for such a difference in opinions, when some folks claim that x64 is fine and some (myself included) claim it's terrible. Maybe it's the sensor itself, maybe something about blood pressure, etc. All I know is that for me x1-x8 is okayish, x64 is just a lottery.
That's why I think it would be a good idea to have a temporary gain customization (#531), just to let folks find their own favorite gain, and at some point pick the best one in a survey or something like that. After that, the gain option will probably not be needed anymore. And besides, maybe by then InfiniTime will have some sophisticated approach for even better sensor readings ;)

@hatmajster
Contributor

Oh, and also, I've been testing the ambient light sensor, and it kind of worked for detecting whether I had the watch on my wrist or not, but I wouldn't consider this feature necessary... Though the TomTom I showed in the video nicely detects whether it's on the wrist or not.

@Riksu9000
Contributor Author

That's interesting. There definitely isn't that big of a difference for me. I wonder why just gain would cause this. The gain is a hardware feature, so it should be better than digitally boosting the signal.

The driver sets some reserved bits in registers PDriver and Res (0x0c and 0x16), so I wonder if they do something unexpected on some units.
https://github.com/JF002/InfiniTime/blob/514481ef7f9c71ad816b31d979c6ab39ce9380dd/src/drivers/Hrs3300.cpp#L22

The 16 bit uint SPL value is converted to float, filtered, and converted to an 8 bit int. Do we know the result always fits in 8 bits? On the video we can see that there is some "calibration" happening, but this might be the filters doing their job. I hope I won't have to be the one to decipher what these numbers mean..

Ppg::Ppg()
  : hpf {0.87033078, -1.74066156, 0.87033078, -1.72377617, 0.75754694},
    agc {20, 0.971, 2},
    lpf {0.11595249, 0.23190498, 0.11595249, -0.72168143, 0.18549138} {
}

It should be possible to update the BPM value more often, because currently a BPM is only ever computed from a buffer full of new, unused data. We could refresh it more often if we reuse some of the data. Also, if we can detect an error in the measurement, we could force the buffer to be refilled with new data before continuing the conversion.
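
Roughly what I mean, as a sketch only (the buffer size, refresh interval and class name below are made up for illustration, not taken from the actual InfiniTime code):

#include <array>
#include <cstddef>
#include <cstdint>

// Overlapping (sliding) analysis window: keep the last DataBufferSize samples
// and re-run the BPM estimate whenever a smaller batch of new samples has
// arrived, instead of waiting for an entirely fresh buffer.
constexpr std::size_t DataBufferSize = 200;       // ~8 s of samples (assumed ~24 Hz)
constexpr std::size_t NewSamplesPerEstimate = 50; // re-estimate roughly every 2 s

class SlidingPpgBuffer {
public:
  // Returns true when enough new samples have arrived for a fresh BPM estimate.
  bool Append(int8_t sample) {
    for (std::size_t i = 1; i < buffer.size(); i++) {
      buffer[i - 1] = buffer[i]; // shift out the oldest sample
    }
    buffer.back() = sample;
    if (filled < buffer.size()) {
      filled++;
    }
    newSamples++;
    if (filled == buffer.size() && newSamples >= NewSamplesPerEstimate) {
      newSamples = 0;
      return true; // the caller can now run the BPM estimate over Data()
    }
    return false;
  }

  // Discard everything, e.g. after a detected measurement error, so the
  // next estimate is computed from fresh data only.
  void Invalidate() {
    filled = 0;
    newSamples = 0;
  }

  const std::array<int8_t, DataBufferSize>& Data() const {
    return buffer;
  }

private:
  std::array<int8_t, DataBufferSize> buffer {};
  std::size_t filled = 0;
  std::size_t newSamples = 0;
};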

The ambient light sensor should at least be used to turn off the measurement and stop the light from flashing.

Having the graph merged would be useful, but I feel it's too ugly in its current state at least.

@bearclaws8

I did some testing with the sensor on different parts of my body to see if there were any correlations with getting odd/high values. Now, before you run away after I said "different parts of my body", it was all perfectly benign. I hadn't seen any info like this written here or on the forum, so it may help with debugging. Here is what I noticed, and some thoughts:

  • Top of wrist, w/ hair, dry: erratic and frequently over 100 /under 50
  • Bottom of wrist, no hair, dry: stable and very close to actual
  • Palm/thumb, no hair, dry: stable and very close to actual
  • Ankle, no hair, dry: stable and close to actual
  • Top of wrist, w/ hair, wet: stable and close to actual
  • Other areas, no hair, wet: stable and close to actual

All of these were with the watch loose on my skin, like wearing the band comfortably. When wet, it was after swimming in the pool, a shower, and/or a workout. The more water retained between the sensor and skin, the better.

So based on this, I think hair is a big part of the problem. This isn't a new problem for these types of sensors, and I assume a few of you may already know this. Even light/fair hair on women or children seems to throw the sensor off. I guess that the light may be bouncing off the hair and being picked up as extra hits/noise in the sensor. This may be obvious to some, but if this is a common cause of the issue then it may help with finding a solution for signal processing. I do not have a background in signal processing, but it reminds me of doing Cpk and SPC charts for manufacturing processes; maybe we can implement something like that here? (I would not know where to start in code... yet)
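
For anyone who wants to try, a control-chart style check could be as simple as flagging samples that drift more than a few standard deviations from a rolling mean. A rough sketch only (the class name and constants below are invented, not based on existing InfiniTime code):

#include <cmath>

// Very rough SPC-style check: keep an exponentially weighted running mean
// and variance of the samples and flag anything further than three standard
// deviations from the mean.
class OutOfControlDetector {
public:
  // Returns true if the sample looks like an outlier worth ignoring.
  bool Update(float sample) {
    constexpr float alpha = 0.05f; // smoothing factor for the running stats
    if (!initialized) {
      mean = sample;
      variance = 0.0f;
      initialized = true;
      return false;
    }
    const float deviation = sample - mean;
    mean += alpha * deviation;
    variance = (1.0f - alpha) * (variance + alpha * deviation * deviation);
    const float sigma = std::sqrt(variance);
    return sigma > 0.0f && std::fabs(sample - mean) > 3.0f * sigma;
  }

private:
  bool initialized = false;
  float mean = 0.0f;
  float variance = 0.0f;
};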

The current gain control seems to be reasonably accurate without hair in the way. At this time, if I want to use the HR sensor I simply roll the watch to the underside of my wrist, and the values are accurate enough and stable. Maybe we put a band-aid in place for now with an animation suggesting the user roll the watch to their forearm until better signal processing is in place (this is the "easy" solution a mechanical person comes up with ☺️).

I did like the animation that @Riksu9000 showed in the top post. It was cool in my opinion and helped show that the monitor was actually doing something. I'm not sure if it was actually following the data, but something like that would be neat. Just my $0.02.

@Avamander
Collaborator

@bearclaws8 this PR will probably improve things for some people: #531

@Riksu9000
Contributor Author

The current gain control seems to be reasonably accurate without hair in the way. At this time, if I want to use the HR sensor I simply roll the watch to the underside of my wrist, and the values are accurate enough and stable. Maybe we put a band-aid in place for now with an animation suggesting the user roll the watch to their forearm until better signal processing is in place (this is the "easy" solution a mechanical person comes up with ☺️).

This sounds a bit silly, but honestly I kind of like this idea. This won't work for everyone though because I think some people want to leave it running and not just take a measurement sometimes.

The animation shows real data after preprocessing.

@DoctorAllcome

Would it help if you had data comparing the heart rate measured by the PT and by other devices (e.g. other wearables, consumer and professional blood pressure meters), taken simultaneously?

And if yes, what would be a good schema? If I made a file with the BPM measured by the PT in the first column and the BPM measured by a comparison device in the second column, would that be fine?
It could be enriched with information like the PT software version, the model of the comparison device, the time of day, and body activity (e.g. during sport, after sport, during rest, with fever). Of course, all the latter things would be optional so that people only share data they are comfortable with (e.g. with respect to privacy).
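For illustration only, one possible layout could look like this (the column names and values below are made up, just to show the idea):

timestamp_utc,pt_bpm,reference_bpm,reference_device,pt_firmware,activity
2021-08-01T10:15:00Z,72,70,chest_strap,1.3.0,rest
2021-08-01T10:15:10Z,74,71,chest_strap,1.3.0,rest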

And how could we practically take the actual measurements of a value that is dynamic while maintaining consistency? How can we capture the values of two devices while correctly associating them with each other? For example, writing down a value measured by the PT and the corresponding value of the other device, and not an unrelated value 10 seconds later.

@bearclaws8

This sounds a bit silly, but honestly I kind of like this idea. This won't work for everyone though because I think some people want to leave it running and not just take a measurement sometimes.

@Riksu9000 I am full of silly ideas, glad you like it.

The animation shows real data after preprocessing.

Thanks for the clarification. I wonder if it is smoothing the peaks a bit too much, which can make it seem "fake". I don't know enough coding to know how to stop it from smoothing the peaks, but my gut tells me that may be the visual disconnect.

@Riksu9000
Contributor Author

Would it help if you had data comparing the heart rate measured by the PT and by other devices (e.g. other wearables, consumer and professional blood pressure meters), taken simultaneously?

I'm not sure if it would help with this. From the graph we can see that when it is working correctly, the signal is quite clean after preprocessing and measuring the bpm is basically just counting the peaks. What would help is if the watch could detect when an error happens during measurement.

@bearclaws8

Does anyone have some sample raw data that could be shared? I am not sure how to pull such data from my PT, and even less sure how to tweak the code to help. But I could try fumbling about with processing the data to see if an equation or some logic can be identified to help flag when the measurement is deviating.

@hatmajster
Contributor

So I just shaved my wrist (it's completely unnoticeable with the watch on) and, well, for me it didn't help. I am totally ignorant in biology, but I think the bottom of the wrist or the finger are just much closer to veins than the top of the wrist - hence better measurements. But still, if someone wants correct measurements, the best approach is probably just to measure closer to the veins - the bottom of the wrist. It's pretty simple and clever @bearclaws8 ;)

This might interest you: I am doing my own testing using FitoTrack, but my old phone keeps betraying me when I run - it loses GPS and breaks the whole training data - so I don't have much to show yet. I've just noticed for now that 8x gain seems to be better than 4x gain while running. Also, maybe related to blood pressure, but it seems like I can't get the watch too tight while running - the measurements are always quite okay. But I can't back that up with any data yet.

FitoTrack naturally shows parsed, not raw data (and I suspect it even tries to fix things up itself a little). But here's what you can do to get raw measurements - using #560 you can log all readings in real time, or store them in a file and later print them with the console. I want to do it myself, but currently don't have the time...

@bearclaws8

So I just shaved my wrist

@hatmajster that is some serious dedication to a project! And thank you for the suggestion for pulling raw data. I too will have to find the time to learn how to use this and then actually use it

@danielbarry

I'm also seeing issues with the readings, specifically when my heart rate is greatly increased. Currently it appears as though when the measurement is above ~160 BPM it caps out and drops back down to ~50/60 BPM, which is literally impossible whilst running.

Lower heart rate values appear to work much better. I am happy to share data if somebody can tell me how to record it.

Has anybody been able to test this against a known good device, like a Garmin watch?

@Avamander
Collaborator

Avamander commented Sep 4, 2021

Please avoid spending too much time on investigating and comparing before/without #531

@daniel-thompson

I've only just been pointed at this ticket.

To progress HR detection algorithms on the PineTime, what is really needed is a corpus of test data to test algorithms against. In other words, a large collection of 24Hz, 10-second captures of raw sensor data, without the filtering that normalizes the sensor output to 8-bit samples.

However, even before that, I think an initial focus on maximizing the signal quality of the raw data and finding suitable sensor settings is worth it. Regarding the original gain settings: before I developed any algorithms at all I did an exhaustive search of every possible sensor mode and selected the settings that, with that set of data, looked the least bad when graphed. Note that I really do mean "least bad"; none of the raw signals were much good (on many of them you couldn't visually see any heart activity at all).

@daniel-thompson

The 16 bit uint SPL value is converted to float, filtered, and converted to an 8 bit int.
Do we know the result always fits in 8 bits?

Yes. This is what the threshold argument to the Automatic Gain Control (AGC) is for: it is designed to normalize the signal to +-100.

Ppg::Ppg()
: hpf {0.87033078, -1.74066156, 0.87033078, -1.72377617, 0.75754694},

This is a high-pass filter with a cut-off frequency set around the lowest possible human heart rate (I forget exactly what, but 40bpm sounds likely). It goes first because PPG sensors have massive drift in the zero point, and a high-pass filter cuts out this low-frequency drift. It is also the first step of de-noising.
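
For anyone following along in the code, the five numbers appear to be the (b0, b1, b2, a1, a2) coefficients of a second-order IIR "biquad" section. Applying one looks roughly like the sketch below - an illustration, not the actual InfiniTime class:

// Sketch of a biquad (second-order IIR) section in direct form II transposed,
// assuming the five constructor values are (b0, b1, b2, a1, a2).
struct BiquadSketch {
  float b0, b1, b2, a1, a2;
  float s1 = 0.0f, s2 = 0.0f; // filter state

  float Step(float x) {
    const float y = b0 * x + s1;
    s1 = b1 * x - a1 * y + s2;
    s2 = b2 * x - a2 * y;
    return y;
  }
};

// The same structure works for both filters; only the coefficients differ,
// e.g. the hpf values {0.87033078, -1.74066156, 0.87033078, -1.72377617, 0.75754694}.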

agc {20, 0.971, 2},

This is a peak-tracking gain filter that figures out what gain level the sensor has and maps its output into a -100 to 100 range. The peak tracking is damped (the second argument is the rate of damping) to avoid big spikes from confusing things. To avoid problems with the damped peak tracking, there is also a threshold to discard samples that are way out of range (this threshold is ultimately what ensures everything fits into a single 8-bit value).
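
The idea, sketched very roughly (the argument names and the exact update rule below are assumptions, not the real wasp-os/InfiniTime code):

#include <algorithm>
#include <cmath>

// Damped peak-tracking AGC: track the recent peak amplitude (damped, so a
// single spike moves it only slowly), reject samples far outside the tracked
// range, and scale the rest so the output stays within +-100.
class AgcSketch {
public:
  AgcSketch(float startPeak, float decay, float threshold)
    : peak {startPeak}, decay {decay}, threshold {threshold} {}

  // Returns the normalized sample; wildly out-of-range samples are dropped.
  float Step(float sample) {
    const float magnitude = std::fabs(sample);
    // Threshold: discard samples way outside the tracked peak. This is what
    // ultimately guarantees the result fits into a signed 8-bit value.
    if (peak > 0.0f && magnitude > threshold * peak) {
      return 0.0f;
    }
    // Damped peak tracking: the estimate follows the signal only slowly.
    peak = decay * peak + (1.0f - decay) * magnitude;
    // Map into the -100..100 range, clamped for safety.
    const float scaled = (peak > 0.0f) ? (sample / peak) * 100.0f : 0.0f;
    return std::clamp(scaled, -100.0f, 100.0f);
  }

private:
  float peak;
  float decay;
  float threshold;
};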

lpf {0.11595249, 0.23190498, 0.11595249, -0.72168143, 0.18549138}

This is a low-pass filter with a cut-off frequency set around the highest possible human heart rate (200bpm?). It provides a final bit of denoising before we attempt to actually figure out the heart rate.

As far as I recall, both filters were designed using the scipy filter design functions. To develop the current algorithm I just loaded all the test data I had into a Python notebook and then kept tweaking the filtering algorithm until I got the highest percentage of correct detections. Note that as part of this I did some experiments using higher-order filters, but they didn't offer much over simple biquads.

All that builds to the advice in my previous post. It's a good idea to go back to eyeballing the raw data and optimizing the sensor settings before spending too much time on the HR detection algorithm. Based on those early graphs that I had, I strongly suspect that many of the problems reported so far are a consequence of the garbage-in/garbage-out principle!

It should be possible to update the BPM value more often, because currently a BPM is only ever
computed from a buffer full of new, unused data. We could refresh it more often if we reuse some of
the data. Also, if we can detect an error in the measurement, we could force the buffer to be
refilled with new data before continuing the conversion.

+1. Running the detection more often is a very good idea (wasp-os needs a couple of extra features before that can be enabled but I doubt those issues affect Infinitime).

Having the graph merged would be useful, but I feel it's too ugly in its current state at least.

+100 ;-) . IMHO a graph is the only way to determine if the data coming from the sensor actually has a detectable heart rate in it (e.g. is the sensor actually giving good data or is it just noise).

PS wasp-os does have a raw sensor capture mode (and can xfer the results over BLE) if anyone does want to go back to graphing raw data looking for better settings!

@danielbarry

@daniel-thompson However, even before that, I think an initial focus on maximizing the signal quality of the raw data and finding suitable sensor settings is worth it. Regarding the original gain settings: before I developed any algorithms at all I did an exhaustive search of every possible sensor mode and selected the settings that, with that set of data, looked the least bad when graphed. Note that I really do mean "least bad"; none of the raw signals were much good (on many of them you couldn't visually see any heart activity at all).

It would probably be good to expose these settings in an 'advanced' menu in Infinitime. I remember looking at the HR sensor some time back and you're just basically sending some different register settings when you enable it?

I wonder whether it might be possible to 'calibrate' the settings per user by 'scanning' through the different register combinations and looking for one that produces the least noisy data? My guess is that different gains, etc, are going to work for different people in different ways.
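
Conceptually the calibration pass could be as simple as the sketch below. SetGain, CaptureRawWindow and SignalQuality are hypothetical placeholders (not existing InfiniTime functions), just to show the shape of the idea:

#include <array>
#include <cstdint>
#include <vector>

// Hypothetical placeholders: a real implementation would use the HRS3300
// driver for the first two and some scoring code for the third.
void SetGain(uint8_t gain);
std::vector<uint16_t> CaptureRawWindow(unsigned seconds);
float SignalQuality(const std::vector<uint16_t>& window);

// Per-user calibration sketch: try each gain for a short window, score the
// captured data (e.g. by how strong the dominant periodicity in the plausible
// heart rate band is compared to the rest), and keep the best gain.
uint8_t CalibrateGain() {
  constexpr std::array<uint8_t, 5> gains {1, 2, 4, 8, 64}; // candidate gains discussed above
  uint8_t bestGain = gains.front();
  float bestScore = -1.0f;

  for (uint8_t gain : gains) {
    SetGain(gain);
    const std::vector<uint16_t> window = CaptureRawWindow(10);
    const float score = SignalQuality(window);
    if (score > bestScore) {
      bestScore = score;
      bestGain = gain;
    }
  }
  SetGain(bestGain);
  return bestGain;
}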

@daniel-thompson This is a high-pass filter with a cut-off frequency set around the lowest possible human heart rate (I forget exactly what, but 40bpm sounds likely).

Can this be adjusted? This is right on the borderline of my resting (40-50). That might explain some of the crazy readings I keep getting.

@daniel-thompson This is a low-pass filter with a cut-off frequency set around the highest possible human heart rate (200bpm?)

I see cited in many places that this is typically '220 bpm - age' [1]. I wouldn't expect much higher than 220 anyway.

[1] https://goodcalculators.com/heart-rate-calculator/

@daniel-thompson

@daniel-thompson This is a high-pass filter with a cut-off frequency set around the lowest possible human heart rate (I forget exactly what, but 40bpm sounds likely).

Can this be adjusted? This is right on the borderline of my resting (40-50). That might explain some of the crazy readings I keep getting.

It is possible to redesign the filters (just drop in new coefficients). In fact it's even possible to calculate the coefficients on-the-fly in reaction to preferences by adding in the filter design algorithms. The same goes for the LPF (although now you mention it, 220 does ring a vague bell; it might already be using that... I wrote this code well over a year ago, so I'd have to dig up my notebooks to dig out more detail about the choice of coefficients). Note that having said it is possible to design filters on-the-fly, that doesn't mean I think it is a good idea. Personally I don't like adding preferences to anything unless their necessity has been proven (e.g. the problem is real and there's not a better way to solve the problem the preference is introduced to paper over ;-) ).

Personally I would be surprised if the HPF causes problems for you. The roll-off is fairly gradual, so I think it's more likely that the data quality is low and that the algorithm is struggling for that reason! In short, my own thoughts on the best way forward remain as before... best for the initial focus to be on getting the best data possible from the sensor and then to build up a corpus of test data to use to design better algorithms (e.g. raw samples for a slow resting heart rate would be very useful).

@danielbarry

@daniel-thompson Personally I don't like adding preferences to anything unless their necessity has been proven (e.g. the problem is real and there's not a better way to solve the problem the preference is introduced to paper over ;-) ).

If the effort is low I can't see any harm, especially if there is some question as to whether the current values are correct. This could be hidden away for access by 'advanced' users only (although I imagine all PineTime users are some kind of hacker at this point).

@daniel-thompson Personally I would be surprised if the HPF causes problems for you. The roll-off is fairly gradual, so I think it's more likely that the data quality is low and that the algorithm is struggling for that reason! In short, my own thoughts on the best way forward remain as before... best for the initial focus to be on getting the best data possible from the sensor and then to build up a corpus of test data to use to design better algorithms (e.g. raw samples for a slow resting heart rate would be very useful).

Sure, happy to help collect the data, I drive my watch daily currently. How would you know that the data is of good quality though - and what the algorithm is supposed to generate? Surely you need some kind of baseline?

@daniel-thompson

@daniel-thompson Personally I don't like adding preferences to anything unless their necessity has been proven (e.g. the problem is real and there's not a better way to solve the problem the preference is introduced to paper over ;-) ).

If the effort is low I can't see any harm, especially if there is some question as to whether the current values are correct. This could be hidden away for access by 'advanced' users only (although I imagine all PineTime users are some kind of hacker at this point).

The usual essay that comes up here is the one by Havoc Pennington: https://ometer.com/preferences.html .

Applied to this situation, it is mostly about when adding preferences keeps people from fixing real bugs. There is no need to venerate my original decision to go for x64 gain (it was based on eyeballing 50-ish plots of raw data, shortlisting the ones where all three graphs for that mode showed a good heart rate, before finally selecting the lowest LED power and the highest gain). If we can show lower gains are beneficial, then it is much better to change the default than to add a preference that most users won't understand the consequences of changing.

@daniel-thompson Personally I would be surprised if the HPF causes problems for you. The roll-off is fairly gradual, so I think it's more likely that the data quality is low and that the algorithm is struggling for that reason! In short, my own thoughts on the best way forward remain as before... best for the initial focus to be on getting the best data possible from the sensor and then to build up a corpus of test data to use to design better algorithms (e.g. raw samples for a slow resting heart rate would be very useful).

Sure, happy to help collect the data, I drive my watch daily currently. How would you know that the data is of good quality though - and what the algorithm is supposed to generate? Surely you need some kind of baseline?

Raw data means the stream of 16-bit samples coming from the HR sensor (e.g. before it is injected into the algorithm). Therefore I wouldn't worry about using an algorithm to assess the data quality until we've gone as far as we can with eyeballing graphs of data. Ideally we would have a tool that automatically explores the sensor settings and captures multiple sets of data in each mode. I guess I could put together something for wasp-os that does this fairly quickly: it can just be a trivial Python script running on the REPL-over-BT ;-) . I think I would prefer to let someone else write that tool for Infinitime though...

@danielbarry

The usual essay that comes up here is the one by Havoc Pennington: https://ometer.com/preferences.html .

Sure, but this is about testing the effect of sensor values on measurement quality. I fully expect these settings could get ripped out in the future. I imagine different skin types, watch tightness, watch hardware, etc, will definitely affect gain value requirements.

Ideally we would have a tool that automatically explores the sensor settings and captures multiple sets of data in each mode.

Is it that clear in the graphs? Are you able to share some? It's hard for me to imagine how easy it would be algorithmically without seeing the data. Hopefully it can be done onboard the watch firmware where the user just lets it calibrate for a few minutes and the ideal values are set in the config for them.

@daniel-thompson

The usual essay that comes up here is the one by Havoc Pennington: https://ometer.com/preferences.html .

Sure, but this is about testing the effect of sensor values on measurement quality. I fully expect these settings could get ripped out in the future. I imagine different skin types, watch tightness, watch hardware, etc, will definitely affect gain value requirements.

I think that argues towards having developer modes that can be used to run experiments rather than a user preference. Also, I still maintain the developer mode needs to include the capability to capture raw samples and exfiltrate them for analysis. Trying to do the analysis on the watch based on preferences will be very slow going.

Ideally we would have a tool that automatically explores the sensor settings and captures multiple sets of data in each mode.

Is it that clear in the graphs? Are you able to share some? It's hard for me to imagine how easy it would be algorithmically without seeing the data. Hopefully it can be done onboard the watch firmware where the user just lets it calibrate for a few minutes and the ideal values are set in the config for them.

This is a reasonably clean set of raw samples where the heart beat is pretty clear:
image

And this is another set, gathered at pretty much the same time, that is much harder to interpret (although I believe the current algorithm actually does a pretty good job with this data). Note that both samples come from (more or less) the same time, from the same subject, with no movement by the test subject, no repositioning of the watch, the same HRS3300 sensor settings, etc.:

image

One key thing to observe is that we've got a lot of headroom with this subject (e.g. the high gain isn't yielding any kind of clipping)

@bearclaws8

@daniel-thompson, how does one get this type of data off of the PineTime? Preferably using InfiniTime and preferably using a sealed unit. I think using a sealed unit may provide more realistic user data than from a dev unit. Although I'm not sure how to even gather data from a dev unit. Is there anything in the Wiki on pulling data? (If not, maybe one of us can add it once we sort it out)

I agree that having a developer mode would be quite useful.

@danielbarry

@daniel-thompson Thank you for sharing the data. I see what you mean now about the filtering - and I can see that automating settings could also be a pain. I've seen some smart watches offer gain as a setting in any case (such as an old Lenovo HW01), maybe it's a good idea to expose this?

In my mind a 'developer mode' is just simply some additional exposed settings that are not easily accessed by the user. I think that doing several different builds is potentially a mistake and could lead towards debugging madness. I think Android's "tap the version a few times to access developer mode" type thing could be done in a simpler way.

@daniel-thompson

@bearclaws8 said:

how does one get this type of data off of the PineTime? Preferably using InfiniTime and preferably using a sealed unit.

@JF002 mentioned it a bit further up, but it might be worth repeating that I founded and maintain wasp-os, so I'm mostly joining in this ticket because the InfiniTime code is a transliteration of the wasp-os heart rate code (which I wrote). Other than this I've not been involved much with InfiniTime.

In wasp-os we can log this sort of information to the filesystem, and wasp-os comes with (admittedly very primitive) file transfer tools to copy the logs back to the host over bluetooth (e.g. so it is absolutely fine for sealed units). I think I still have the python scripts lying around that can do this (again, the primitive file transfer allows us to copy the script over bluetooth and run it). InfiniTime has recently added a filesystem that gives a good place to store logs, although I don't know if there are file transfer tools yet.

@danielbarry said:

I've seen some smart watches offer gain as a setting in any case (such as an old Lenovo HW01), maybe it's a good idea to expose this?

I'm old fashioned enough to think a decision to expose such a setting should be evidence based. Ultimately a setting should not be added just because nobody has the confidence to say "x64 doesn't work for most users, let's change the gain to x8". That is the essence of everything I have been trying to say about preferences. Preferences are not intrinsically bad, but it is a terrible idea to add preferences because maintainers are not confident enough to do the right thing instead of the stupid thing and they decide to compromise by doing both (and it's even worse if the default setting for the preference is the stupid thing... the "Please unbreak my app" preference in Havoc's essay).

As mentioned above, I'm just the wasp-os guy trespassing in @JF002 's domain! I've shared my view (multiple times) that the best way to move forward is to write tools to capture raw data to support decision making. I will try to gather the wasp-os version of these tools into a nice form and, unless anyone objects, will let people know when I have that. However, other than that I think I should step aside for a bit and leave space for InfiniTime coders to comment.

@danielbarry

@daniel-thompson I'm old fashioned enough to think a decision to expose such a setting should be evidence based. Ultimately a setting should not be added just because nobody has the confidence to say "x64 doesn't work for most users, let's change the gain to x8".

Sure, I understand your point. I believe there is reason to think that the default settings do not work for everybody, and there may not even be default settings that work for everyone across the board.

Just earlier I took my PineTime for a walk; I manually measured my heart rate as 60/70 and the watch was suggesting 150/160. What I would have liked to have done at the time is manually switch the gain to see what effect it has. Even better would have been to collect the data itself (although I am not sure about the current progress of storing data on a sealed watch).

@daniel-thompson I will try to gather the wasp-os version of these tools into a nice form and, unless anyone objects, will let people know when I have that. However other than that I think I should step aside for a bit and leave space for Infinitime coders to comment.

I think everybody here appreciates your point of view and comments, especially regarding your own tests and code.

@Avamander added the enhancement label Oct 2, 2021
@ajack2001my mentioned this issue Dec 13, 2021
@ruzko

ruzko commented Feb 3, 2022

Has there been any progress on this issue?
I'm resting now, with the PT running v1.8 on top of my wrist with a loose band. HR is mostly stable at ~50, but skips around to 70 and 130 from time to time.

@JF002
Collaborator

JF002 commented Feb 4, 2022

@ruzko We merged #531 a few days ago (it's available in develop, but not released yet). This PR changed the default value for the gain to x8, as many users reported better results with that default setting.

As mentioned above, I'm just the wasp-os guy trespassing in @JF002 's domain! [...] However, other than that I think I should step aside for a bit and leave space for InfiniTime coders to comment.

@daniel-thompson I'm sorry I couldn't follow this conversation more closely at the time! You're always more than welcome here, and I value your opinion and knowledge :) The next move is probably to implement an easy way to fetch raw data from the sensor so people who master signal processing and filtering can figure out a better algorithm!

@mc0e

mc0e commented Apr 13, 2022

I'm just learning about both heart rate monitoring and the Pine Time because I am currently experiencing issues with how my own heart rate is behaving. Still, I've learnt enough to see some issues with some of what's being said in the comments on this page.

I think it's worth asking what the intended scope is for the PineTime's usefulness. If it is only intended to be useful for the majority of users, with typical healthy hearts, for things like fitness use, then just counting the peaks might be adequate, and it's great that this is available. I'd hope though that the PineTime might also be useful to people who want to monitor their heart rate because it is not performing healthily. Such users are obvious candidates for wanting to monitor their heart rate, and it would be nice if affordable privacy-respecting open source options were available to them also.

I haven't gone into depth with the actual algorithms being used, but comments in this thread have included assumptions about heart rate range and about the waveform seen in the data from the PineTime's sensor which won't cover all users.

At the low end of heart rates, [3] refers to a healthy heart rate of 27bpm, and pathological cases in the 20-40 range. I have a neighbour with bradycardia who regularly sees 30bpm. At the upper end, [4] refers to a pathological but non fatal heart rate of 600, though it's apparently very unusual to find more than the low 300s, due to the absolute refractory period of the AV junction. Presumably that's higher in infants.

I'm also seeing comments here about the "peak tracking gain filter". Simply "measuring the peaks" is likely to produce errors.

I've found details of the PineTime's sensor at [1], and I presume that "PPG" in that document refers to photoplethysmography, which is the technology that's also found in a fingertip pulse oximeter. Some technical detail on reading PPG waveforms can be found at [2]; in particular, take a look at figure 19.2 in that paper. Also, do an image search for "photoplethysmography dicrotic notch" to see a range of other observed waveforms.

For most people, most of the time, PPG waveforms will look close to a sawtooth waveform with a fast incline and a slower decline, but that isn't always the case. What we want is to measure the rate of systolic peaks, but we expect the sensor will fairly commonly show a smaller secondary peak in each cycle, and in some cases it's not even smaller.

There's detailed literature out there about interpreting PPG wave forms. I don't know what the processing power limits are in the PineTime, but it's probably possible to do a pretty decent job of this.

Probably this should be a new issue, but for now I'll put it up here.

[1] "HRS3300 Heart Rate Sensor" http://files.pine64.org/doc/datasheet/pinetime/HRS3300%20Heart%20Rate%20Sensor.pdf

[2] "Photoplethysmography: Analysis of the Pulse Oximeter Waveform" http://dx.doi.org/10.1007/978-1-4614-8557-5_19
(with full text at https://www.researchgate.net/profile/Kirk-Shelley/publication/278716052_Photoplethysmography_Analysis_of_the_Pulse_Oximeter_Waveform/links/559edc5708ae03c44a5cdba0/Photoplethysmography-Analysis-of-the-Pulse-Oximeter-Waveform.pdf)

[3] Bradycardia page on Wikipedia https://en.wikipedia.org/wiki/Bradycardia

[4] "Mouse Heart Rate in a Human: Diagnostic Mystery of an Extreme Tachyarrhythmia" https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3273956/

@JF002
Collaborator

JF002 commented Apr 13, 2022

@mc0e Thanks for all this detailed information and the links!
I have to admit I have very little knowledge of digital signal processing and of the algorithms needed to process data from the heart rate sensor.
In its current state, the heart rate sensor is functional and displays relatively sensible information, but it is far from being a medical device one could use to diagnose or monitor medical conditions.
I think the heart rate sensor (HRS3300) is quite basic, and the documentation provides very few details on how to set it up and how to process the data it provides. A closed-source library that processes the data and provides a heart rate value is available but... it's closed source, unfortunately (and it also needs quite a lot of RAM, IIRC).

I'd hope though that the PineTime might also be useful to people who want to monitor their heart rate because it is not performing healthily. Such users are obvious candidates for wanting to monitor their heart rate, and it would be nice if affordable privacy-respecting open source options were available to them also.

I totally agree with you. However, the number of developers with sufficient knowledge to implement such algorithms and DSP is probably quite limited. Let's hope someone willing to implement them will show up at some point :)

@mc0e

mc0e commented Apr 14, 2022

Another data point on the range of heart rates that might be reasonable to look for: the fingertip pulse oximeter I use (a Heart Sure A320) measures from 30-250 BPM.

https://www.manualslib.com/manual/1260715/Heart-Sure-A320.html?page=14#manual

@daniel-thompson

Firstly, let me say that I don't particularly want to launch a defense of the current algorithm (which I wrote). It was written over just a few evenings based on very limited test data. Moreover, the quality of the data coming off the sensor (as shown in the graphs above) was really bad... much worse than the PPG charts I saw in the research papers that I reviewed (not sure if it was those @mc0e highlighted... but I certainly did read some of the research). As a result, I simply selected the most noise-resilient approach I could think of and tuned the resulting algorithm to give the maximum number of good results from the test data I had.

If you are interested in pursuing this, then the best thing that can be done to improve the situation would be to start finding ways to crowd-source lots of additional test data off the PineTime sensor in order to allow algorithms to be developed and tested. This means data captures annotated with an estimated heart rate, organized, anonymized and perhaps uploaded to a GitHub project. I don't really have the energy to do the organizing for this, but I'm happy to offer advice and encouragement (maybe even a bit of mentoring on the technical aspects) to anyone who does!


As said above, I don't think launching a defense of the current algorithm is helpful (because the point about crowd-sourcing test data is way more important). However, I do want to make sure the current algorithm is correctly understood, since that might help people figure out improvements (or at least replicate what it is good at if they design replacements).

Firstly the high and low-pass filters are low-order filters and have a fairly gentle roll-off (filters with stronger roll-offs actually resulted in worse algorithm performance). That is why they can be set to cover a fairly conservative range: the "cut-off" frequency does not mean all signals above/below the cut-off are discarded (although they are attenuated somewhat).

Secondly, the peak-tracking filter is not responsible for calculating the heart rate... it simply handles the automatic gain control. The AGC helps normalize the data a bit for later processing. By design the AGC guarantees that the data is always in the range -100 to 100 (e.g. it fits into a single byte to make it cheap to store in the tiny RAM of the watch, and it also fits onto a watch display that is 240 pixels high in order to make it easy to graph). Whilst perhaps the high- and low-pass filters could be better tuned, I'm actually pretty happy with the behavior of the current code up to this point. It does a good job of cleaning up the data into something we can work from, and being able to graph the intermediate results on the watch is super helpful.

The first few elements clean up the data. After that we get to the actual heart rate code. This works by looking for auto-correlation in the signal data (currently between 210 and 30 bpm). Auto-correlation is jargon that means the code relies on the fact that two heart cycles look similar to (correlate with) one another. This approach has some great properties: it is highly noise resilient and should already cope extremely well with double peaks in a heart cycle. However, this algorithm does have problems. It is not great for very fast heart rates[1] and it doesn't like irregular heart rhythms (it cannot even see them... having something that can detect heart rate variance was the goal of many of the papers I was originally reading).

[1] Note that the sensor can only sample at max 25Hz, so there aren't many samples per heart cycle at high heart rates... that leaves little to correlate with.
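
For anyone wanting to experiment, the auto-correlation search boils down to something like the simplified sketch below. This is an illustration of the idea, not the exact wasp-os/InfiniTime implementation; the 24Hz sample rate and the other details are assumptions:

#include <cstddef>
#include <cstdint>
#include <vector>

// Simplified auto-correlation search: slide the cleaned-up signal against
// itself and pick the lag (in samples) where it matches itself best,
// restricted to lags corresponding to roughly 30-210 bpm.
int EstimateBpm(const std::vector<int8_t>& samples, float sampleRateHz = 24.0f) {
  const int minLag = static_cast<int>(sampleRateHz * 60.0f / 210.0f); // fastest plausible HR
  const int maxLag = static_cast<int>(sampleRateHz * 60.0f / 30.0f);  // slowest plausible HR

  int bestLag = 0;
  float bestCorr = 0.0f;
  for (int lag = minLag; lag <= maxLag; lag++) {
    if (static_cast<std::size_t>(lag) >= samples.size()) {
      break;
    }
    float corr = 0.0f;
    for (std::size_t i = static_cast<std::size_t>(lag); i < samples.size(); i++) {
      corr += static_cast<float>(samples[i]) * static_cast<float>(samples[i - lag]);
    }
    corr /= static_cast<float>(samples.size() - lag); // normalize by overlap length
    if (corr > bestCorr) {
      bestCorr = corr;
      bestLag = lag;
    }
  }
  return bestLag > 0 ? static_cast<int>(sampleRateHz * 60.0f / static_cast<float>(bestLag)) : 0;
}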

@mc0e

mc0e commented Apr 14, 2022

If you are interested in pursuing this, then the best thing that can be done to improve the situation would be to start finding ways to crowd-source lots of additional test data off the PineTime sensor in order to allow algorithms to be developed and tested. This means data captures annotated with an estimated heart rate, organized, anonymized and perhaps uploaded to a GitHub project.

To some extent we're customising to the PineTime sensor and measuring environment, and testing whether we're actually getting the desired results with that data, in which case we'd need a data set that reflects that environment.

If this approach goes too far though, we could be replicating prior research efforts. Are there relevant and already existing corpora, and are they similar enough after initial band-pass filtering that we can use them?

@daniel-thompson

If this approach goes too far though, we could be replicating prior research efforts.

I don't think we need worry about going too far any time soon! As far as I am aware, every data driven decision made in this area so far was based on a small set of samples taken from a single subject at rest.

Even the decision to switch to x8 gain was based purely on anecdotal feedback rather than being data driven (because there was no data collected to be driven by).

@mc0e

mc0e commented Apr 14, 2022

Thanks for that @daniel-thompson. It already answers a number of questions I had.

Yes, I'd like to pick your brain a bit about this, initially to better understand how manageable a project this is for me. I have relevant skills, but not in embedded development. What's the best way to contact you?

@daniel-thompson

What's the best way to contact you?

I'm danielt in both the Pinetime Development chat room and the #wasp-os IRC channel. If you go looking in the github history for wasp-os then you'll be able to pick out an e-mail address too.

@stevensu1838

Hi all,

First, I've noticed the heart rate reading from the Pine Time watch sometimes goes up to 160 for no reason. Do you have any solutions?
Secondly, may I please ask if you think the accuracy of the heart rate measurement is reliable? I am choosing between a PineTime watch and a Garmin watch for heart rate readings for research purposes. The advantage of the PineTime is that it is open source and data streaming is free, whereas live data streaming is a paid feature with Garmin watches. Can you experts share your insights?

@daniel-thompson

First, I've noticed the heart rate reading from the Pine Time watch sometimes goes up to 160 for no reason. Do you have any solutions?

There's some great work being done (not by me) on wasp-os/wasp-os#363 to better understand the HRS3300 sensor use on these systems. I think it is already obvious from their work so far that, even with no changes to the HRS algorithm, the system would be better if the HRS3300 settings were changed to double the LED power, reduce the ADC range by 1 bit and set the intra-sample wait time to 0. These changes will substantially increase the hardware sampling rate without changing the effective sample depth (since when I looked at raw data from my devices, doubling the LED power more or less compensates for the lost ADC range).

Secondly, may I please ask if you think the accuracy of the heart rate measurement is reliable? I am choosing between Pine Time watch and Garmin watch for heart rate reading for research purpose. The advantage of Pine Time is open source and free for data streaming. However, It is charged for live data streaming with Garmin watches. Can you experts share your insights.

I guess the most significant advantage with the PineTime for research purposes is that you don't have to stream the heart rates at all! Instead you could stream raw PPG samples and process them offline using software that has some existing academic credibility (HeartPy, for example).

@bakeromso

I have very little experience in the field myself, but what are your thoughts on training a model to determine the heart rate instead of manually doing the filtering?

What I had in mind is collecting datasets of the heart rate sensor of the PineTime and simultaneously collecting a dataset from a more accurate sensor. If this more accurate sensor would be resilient to movement, we could even include the accelerometer from the PineTime in the dataset so that it can train on determining the heart rate while exercising.

I am not sure what the best way would be to extract the data from the PineTime. I was thinking the data could be timestamped so that it can more easily be compared with the more accurate dataset (given that is also timestamped). What this more accurate sensor would be is also open for debate. Some pulse sensor like https://www.sparkfun.com/products/15219? Or an ECG like https://learn.sparkfun.com/tutorials/ad8232-heart-rate-monitor-hookup-guide/all?

Again, I have no experience in this field of data science. However, I am very eager to learn how viable you guys think such an approach can be.

@khimaros

maybe this issue can be resolved after #1486 lands in a release version?

@JF002
Collaborator

JF002 commented May 18, 2023

Yep, I hope those changes will improve the HR measurement speed and accuracy!

@FintasticMan
Member

Closing, now that #1486 has been merged.
