just me

  • 0 Posts
  • 176 Comments
Joined 1 year ago
Cake day: October 3rd, 2023



  • same way placebo still works (to a degree) even when you know it’s placebo

    your subconscious is not logical, and no amount of conscious logic will fully defeat its influence

    to think yourself immune is foolish and dangerous - that’s when you allow it to work even better, as you “logically” explain away every manipulation you were influenced by and convince yourself you made the decision entirely on your own. The danger gets even greater when it comes to political propaganda, which uses the exact same tricks as marketing


  • not just cheaper though

    even subconsciously, $15.55 doesn’t feel that much better than $15.56

    but in a change from $20 to $19.99 the first digit is smaller, and that gives our ape brains the feeling that it’s not as expensive

    to reveal the vibes your brain operates on, think about bigger numbers. Imagine you’re in a bit of a rush - you want to buy something, but family is waiting, or you need to walk your dog, or maybe you’re doing your shopping before work. Regular life stuff.

    first scenario

    the same item is sold for $2920 in the first store you visit, and for $2970 in the second store you visit. The stores are an inconvenient travel time away from each other. Do you go back to the first store?

    second scenario

    now, the same item is sold for $2975 in the first store you visit, and for $3025 in the second store you visit. The stores are still an inconvenient travel time away from each other. Do you go back to the first store?

    though the difference is $50 in both cases, the jump from $2975 to $3025 feels more significant than the one from $2920 to $2970. And obviously many of us would go back for the cheaper option either way, but there are a lot of people on this planet who have money to spare but not the time, and plenty of other circumstances besides - marketing people know this and will do their damnedest to sway you into buying their product
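
    a quick toy sketch (Python - the helper name leading_block and the scenario labels are just made up for illustration, the prices are the ones above) of why the two $50 gaps don’t feel the same:

    ```python
    # toy illustration of the left-digit effect described above:
    # the dollar gap can be identical while the first digit you read changes (or doesn't)

    def leading_block(price: float, place: int) -> int:
        """Digit block at the given place value, e.g. place=1000 -> thousands digit."""
        return int(price // place)

    # (label, price in store one, price in store two, place value of the digit read first)
    scenarios = [
        ("$19.99 vs $20",  19.99, 20.00, 10),
        ("$2920 vs $2970", 2920,  2970,  1000),
        ("$2975 vs $3025", 2975,  3025,  1000),
    ]

    for label, a, b, place in scenarios:
        digit_changes = leading_block(a, place) != leading_block(b, place)
        print(f"{label}: gap ${b - a:.2f}, leading digit changes: {digit_changes}")
    ```

    same $50 gap in both store scenarios, but only the second one flips the thousands digit, and that first digit is exactly what our ape brains latch onto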


  • i’m baffled by people who use OneDrive, especially in its default setting where it saves all your shit in the cloud. Even if they have a perfectly stable internet connection 24/7, do they not feel weird having a company keep their files on some drive they’ll never see? do they even know this is what’s happening?


  • honestly, this is not a terrible idea

    if you see someone on the verge of a panic attack, that means they’re fully in their head, spiraling - you can try to calm them down the normal way, but you can also try to force them out of their own head and ground them by saying something weird, ideally a question, so their mind can latch onto it. It won’t always work, but it might shock them just the right amount to ground them!


  • shneancy@lemmy.world to Technology@lemmy.world · *Permanently Deleted* · 16 days ago

    this is not about wanting, it’s about companies taking advantage of vulnerable people who should be grieving. This can cause lasting psychological harm

    you might as well be saying: if someone came to a drug maker wanting some heroin, provided the ingredients for heroin, and agreed to whatever costs were involved, isn’t that entirely their business?



  • shneancy@lemmy.world to Technology@lemmy.world · *Permanently Deleted* · 16 days ago

    wow, so many reasons

    • to create a mimic of a person you must first destroy their privacy
    • after an AI has devoured everything they’ve ever written or said on video, it will mimic that person very well, but it will most likely still be the legal property of the company that made it
    • in a situation like that you’d then have to pay a subscription to interact with the mimic (because god forbid anything is ever actually sold to you outright nowadays)

    now imagine having to pay to talk with a ghost of your loved one - a chatbot that sometimes lets you forget the actual person is gone, and makes every moment where that illusion breaks all the more painful. A chatbot that denies you grief and traps you in a hell where you can talk with the person you lost, but never touch them, never feel them, never see them grow (or you could pay extra for the chatbot to attend new skill classes for you to talk about :)).

    It would make grieving impossible and take constant advantage of those who “just want to say goodbye”. Grief is already hard as it is; widespread mimicry of our dead would turn it into psychological torture

    for more information, watch a prediction of our future - a fun sci-fi show called Black Mirror, specifically the episode “Be Right Back” (the entire series is fully episodic, so you don’t need to watch from the start)