Fitbit wants to predict depression: How much data is too much?

  1. ro66i3 , via OnePlus 6T McLaren Edition , Jan 13, 2022 :
    Wore an Apple Watch for 3 years straight. I was amazed by it for the 1st year, with its "oh so cool" features, and regarded it as the one true product that would unite the entire tech platform I'm on. It actually came exceptionally close. In the 2nd year it became an indispensable part of my life: fitness, sleep, music, TV, notifications, calls, emails, calendar appointments & of course time tracking during the WFH craziness. By the 3rd year I could feel the mental burn of the device & the wave of obsession I was riding. It crept up the charts like Honey ji's crass songs.

    Sold it at 75% of the purchase price & switched back to my high-end chronos. I stay much calmer now, & the pretty traditional watch designs with their delicate mechanical tick-tocks make me fall in love with them all over again, like the day I bought them. A feeling I never got with the nearly similarly priced Apple Watch.
     

    #21
    PritishPriyam likes this.
  2. seancojr
    Cupcake Jan 13, 2022

    seancojr , via OnePlus 9 Pro , Jan 13, 2022 :
    I'm surprised not to have seen anyone mention Amazon's Halo fitness and health wearable, in terms of a company having gone this route before.
    Personally, I think Fitbit is asking to track too much for too little in return to the consumer. Even if someone managed to discover things about their emotional wellness (assuming they're not already aware), these kinds of products cannot replace professional care.
     

    #22
    PritishPriyam and dsmonteiro like this.
  3. palc
    Ice Cream Sandwich Jan 13, 2022

    palc , via OnePlus Nord , Jan 13, 2022 :
    Very true! That's why there are psychologists/therapists to do such stuff - analyse the signs of depression and offer help/medication.

    However, in the context of the broad question, I believe there actually exists a 3rd option. Let the device listen, but the learning from it has to be local, i.e. on device, and no personal data should be exported except for the learned parameters (or whatever your algorithm uses to recognise patterns).

    Think of it the following way:

    When you visit a mental health professional (psychologist/therapist, etc.), you're confiding in them your personal data (opinions, thoughts, etc.), and they're bound by client confidentiality not to share your data outside.
    However, over the course of years of practice, they themselves become a big data store of all their past experiences with other clients, and they know what works and what doesn't, i.e. learned parameters. They also have a clause under which they can share client details in a general sense (maintaining client confidentiality) with other psychologists, to get help with a case or to seek an expert opinion - sharing parameters.

    The following generally hold true with regard to mental health:

    1. In high-income countries, mental health care is becoming extremely inaccessible for the middle class due to the rising costs of therapy.

    2. In low-income countries, mental health is almost entirely neglected, because the priority for wellbeing follows this order: basic necessities for self > need fulfilment for people close by > mental health for self. Only when one fulfils the first 2 is one able to think about their mental health. There is also the problem of awareness.

    3. In middle-income countries, mental health is still neglected because of a lack of awareness. Everyone thinks they can handle it themselves when they actually can't. People neglect signs of depression, or dismiss them as simple demotivation or, as people from my home country would recognise, बहाना (making excuses).


    In all of these situations, personal devices that can recognise such patterns related to mental health, while preserving your privacy, actually make a lot of sense. There are many forms of therapy that people can practise without consulting a mental health professional, and those are actually known to improve different mental health conditions like depression, anxiety, etc.


    Now, if your smart devices could actually see early signs of these conditions and recommend proper techniques for you to follow, all while implementing local learning, preserving your privacy, promoting awareness, saving you money, and being accessible to millions - I believe it would be a game changer for the wellbeing of the population at large.
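    The on-device learning idea above can be sketched in code. What follows is a minimal, hypothetical federated-averaging-style sketch (all function names here are invented for illustration, not any vendor's actual API): each device trains a tiny model on its own sensor data, and only the learned weights - never the raw readings - are handed over for aggregation.

    ```python
    # Hypothetical sketch: raw data stays on-device; only parameters are shared.
    import math

    def train_locally(samples, epochs=50, lr=0.1):
        """Logistic regression trained entirely on-device.
        `samples` is a list of (feature_vector, label) pairs that never
        leave this function -- only the learned weights are returned."""
        n = len(samples[0][0])
        w = [0.0] * n
        b = 0.0
        for _ in range(epochs):
            for x, y in samples:
                z = sum(wi * xi for wi, xi in zip(w, x)) + b
                p = 1.0 / (1.0 + math.exp(-z))  # sigmoid prediction
                err = p - y
                w = [wi - lr * err * xi for wi, xi in zip(w, x)]
                b -= lr * err
        return w, b  # the only thing exported off the device

    def aggregate(device_params):
        """Server-side step: average the parameters from many devices.
        The server never sees any device's raw sensor data."""
        ws, bs = zip(*device_params)
        n = len(ws[0])
        avg_w = [sum(w[i] for w in ws) / len(ws) for i in range(n)]
        avg_b = sum(bs) / len(bs)
        return avg_w, avg_b

    # Two toy "devices", each with its own private data:
    dev_a = train_locally([([0.0], 0), ([1.0], 1)])
    dev_b = train_locally([([0.1], 0), ([0.9], 1)])
    shared_model = aggregate([dev_a, dev_b])
    ```

    A real deployment would be far more involved (secure aggregation, differential privacy, and clinically validated features), but the privacy boundary is the same: the function that touches raw data runs on the device, and only its parameters cross the network.
    
    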

    What do you think @SJBoss @PritishPriyam ?
     

    #23
  4. pinkhelmet
    Gingerbread Jan 13, 2022

    pinkhelmet , Jan 13, 2022 :
    The idea sounds good. Of course, a group of people with good intentions could have come up with this, but there will always be another group to somehow spoil things. I certainly do not want this. I mean, it's a choice - it's OK if people would like to try this. I personally feel that a microphone listening to all my conversations is over the line.
     

    #24
    PritishPriyam likes this.
  5. G_plusone
    Nougat Jan 13, 2022


    #25
  6. PritishPriyam
    Jelly Bean Writers' Club Jan 13, 2022

    PritishPriyam , Jan 13, 2022 :
    I love your take on the topic.
    The general view of society towards mental health is unfortunately quite ignorant, but there's still hope that things will change.
    There's no denying that mental health is a domain that cannot be so easily touched upon and predicted.
    While the companies may be trying to help you, you most certainly cannot trust them with your data, and what you're suggesting seems quite difficult to achieve, honestly.

    Let's say the device diagnoses that you're in depression - then what are you going to do?
    The ultimate question is: is the device's diagnosis really helping you if you're not going to take any action on it?
    Go to a doctor and start therapy, you'd reply.
    But then you've already described society's negligent attitude towards mental health and the high costs associated with therapy.
     

    #27
    Grey_Cr0c and seancojr like this.
  7. Grey_Cr0c , via OnePlus 9 Pro T-Mobile , Jan 17, 2022 :
    So okay, I can possibly see notifications, but to answer the phone, go get a headset. Not everyone needs to hear your call. Convenience makes laziness. Just my view...
     

    #28
  8. Grey_Cr0c , via OnePlus 9 Pro T-Mobile , Jan 17, 2022 :
    Okay, I truly can't see how giving away more of my privacy is going to help if I'm depressed. I feel more depressed that my privacy is not mine - every app wants me to give up my privacy or it won't work correctly. It's a shame we are forced into this. It's not right. I am pleased to know OnePlus still listens to its consumers and produces awesome products, which is what brought me to OnePlus. Please don't follow the trend; continue being the unicorn that pushes the limits, and the supporters will continue to back you up.
     

    #29
  9. YRJ
    The Lab - OnePlus 7T Reviewer; Community Hero 2020 Community Expert Writers' Club Jan 18, 2022

    YRJ , Jan 18, 2022 :
    I use the watch only for receiving calls. The sound output, however, is the Buds Pro or Bullets Wireless.
     

    #30
  10. hotshot_shoty , via OnePlus Nord N10 5G T-Mobile , Jan 19, 2022 :
    I think it's about time - I have needed this for a couple of years. I suffered a mental breakdown in 2017, and I made a huge mistake by getting involved with my first husband again in the summer of 2018. I sure would be able to prove the mental anguish and torture I have had to endure for 730 days. I don't know how much more of this I can take right now. I'm so exhausted I just want to curl up in a nice fluffy bed with my little Doxie and sleep for 730 days.
     

    #31
  11. W1643328407694
    Cupcake Jan 28, 2022

    W1643328407694 , via OnePlus Nord 2 , Jan 28, 2022 :
    I use a smartwatch to monitor how much sleep I get. I'm a normal everyday person.
     

    #32

  13. I1652830936787
    Cupcake May 18, 2022 at 12:44 AM

    I1652830936787 , May 18, 2022 at 12:44 AM :
    Since they are already listening to us (and this is no conspiracy), I think this particular function will make no difference to the general pattern. Sure, it might be helpful to some people, but I can't imagine it being really effective.
     

    #34