[00:00.000 --> 00:07.280] I'm Dave Champion, and welcome to the first podcast I've done in a very long time. [00:07.280 --> 00:12.580] I was podcasting before podcasting was a thing, probably, I don't know, 12, 13, 14 years ago, [00:12.580 --> 00:15.520] and podcasting was not really a thing back then. [00:15.520 --> 00:20.300] And so I switched out with the emergence of social media. [00:20.300 --> 00:24.620] I stopped podcasting because there wasn't much of an audience for it back then. [00:24.620 --> 00:31.160] And then I began to engage more frequently in social media, and that's been the direction [00:31.160 --> 00:33.440] I've had for the last decade. [00:33.440 --> 00:38.860] So why, after all these years, am I back podcasting? [00:38.860 --> 00:41.260] To put it in a word, Facebook. [00:41.260 --> 00:42.740] I'm going to get into that in a minute. [00:42.740 --> 00:51.180] So welcome to the arbitrarily numbered episode 1,000 of the Dr. Reality Dave Champion podcast. [00:51.180 --> 00:53.420] I say it's arbitrary, but it's kind of not. [00:53.420 --> 00:57.220] It is and isn't. [00:57.220 --> 01:03.220] Considering all the content I've put on social media of all kinds over the last 15 years, [01:03.220 --> 01:06.100] we're talking literally thousands upon thousands upon thousands. [01:06.100 --> 01:08.980] So a thousand is not a bad place to start. [01:08.980 --> 01:11.540] So welcome to episode 1,000. [01:11.540 --> 01:16.340] OK, so why did I start doing this again? [01:17.180 --> 01:24.100] I had made a fairly significant investment in terms of time and money and energy into [01:24.100 --> 01:25.100] Facebook. [01:25.100 --> 01:26.400] And by money, I mean advertising. [01:26.400 --> 01:28.820] Using Facebook as an advertising platform. 
[01:28.820 --> 01:40.180] So I put a lot of resources into that only to have Facebook turn into some crazy, you-can't-say-anything [01:41.180 --> 01:50.980] remotely, I don't even want to say controversial, that might perhaps rub somebody the wrong [01:50.980 --> 01:51.980] way. [01:51.980 --> 01:53.660] You can't do that on Facebook. [01:53.660 --> 01:54.660] And I'm not exaggerating. [01:54.660 --> 01:57.020] I'm going to give you an example of that in a few moments. [01:57.020 --> 02:02.780] But in my case, especially in the last year concerning the SARS-CoV-2 outbreak and my [02:02.780 --> 02:07.180] numerous videos that I've posted on Facebook concerning the science, the facts, the evidence, [02:07.180 --> 02:11.980] and so forth, yeah, Facebook moderators have really gotten a case of the ass burn with [02:11.980 --> 02:13.420] me about that. [02:13.420 --> 02:16.420] And the way they've handled that is to target me. [02:16.420 --> 02:21.040] In other words, this was literally a test done by some friends. [02:21.040 --> 02:23.860] I was suspended for a meme. [02:23.860 --> 02:28.500] And so a group of my friends then put the same meme on their pages. [02:28.500 --> 02:30.140] Not a peep out of Facebook. [02:30.140 --> 02:36.180] So because of my coverage of SARS-CoV-2 and the fact that I use facts, data, and evidence [02:36.300 --> 02:41.500] to show that more often than not, the establishment narrative is completely false, Facebook can't [02:41.500 --> 02:42.500] tolerate that. [02:42.500 --> 02:43.500] So I've been targeted. [02:43.500 --> 02:44.500] So that's one reason. [02:44.500 --> 02:47.340] But that's more of a commercial thing. [02:47.340 --> 02:52.580] So let's look at something personal to show you how insane the whole, I would call it [02:52.580 --> 02:55.940] Facebook moderation, but I don't even think that's an appropriate term anymore. [02:55.940 --> 02:56.940] I don't know what to call it. 
[02:56.940 --> 03:01.340] But I think moderation is far, far and away something different than what Facebook is [03:01.340 --> 03:02.340] doing. [03:02.340 --> 03:08.420] Here's an example of where Facebook, the insanity that has consumed Facebook in terms [03:08.420 --> 03:09.420] of moderation. [03:09.420 --> 03:12.820] A friend of mine on Facebook was chit chatting with a friend of hers on Facebook. [03:12.820 --> 03:14.380] Now these are friends. [03:14.380 --> 03:17.100] This is not an adversarial conversation. [03:17.100 --> 03:21.700] And the guy was saying, I'm tired of dating whores. [03:21.700 --> 03:24.700] Now I don't know what his definition of whores is. [03:24.700 --> 03:28.940] And so she responded and said, yeah, I'm tired of dating man whores. [03:28.940 --> 03:31.740] No, I don't know what her definition of man whore is. [03:31.740 --> 03:35.340] But this was a conversation, two people joshing back and forth. [03:35.340 --> 03:36.340] Right? [03:36.340 --> 03:37.340] No big deal. [03:37.340 --> 03:38.340] Nothing going on. [03:38.340 --> 03:39.340] Nobody being offended. [03:39.340 --> 03:43.340] It's all in good humor between people who know one another and they're having fun and [03:43.340 --> 03:49.020] conversing until she got suspended by Facebook. [03:49.020 --> 03:54.060] And I actually wrote this down because I wanted to make sure that I got this right for you. [03:54.060 --> 04:04.780] She was suspended because the language could make a person feel inferior. [04:04.780 --> 04:08.700] Oh, wow. [04:08.700 --> 04:13.020] The language could make a person feel inferior. [04:13.020 --> 04:23.340] So I think the main point I want to make about that particular criteria for suspending somebody [04:23.580 --> 04:26.100] is there is no line. [04:26.100 --> 04:29.300] Because when you set up something like community standards, there should be some discernible [04:29.300 --> 04:30.300] lines. 
[04:30.300 --> 04:33.380] So somebody sitting there doing their thing on Facebook and they're considering posting [04:33.380 --> 04:35.900] something and wait. [04:35.900 --> 04:39.580] Does that violate Facebook's community standards? [04:39.580 --> 04:42.780] There should be some way to know that. [04:42.780 --> 04:47.220] And perhaps because I've been targeted, I know that there isn't. [04:47.220 --> 04:53.140] The community standards that Facebook has put in place are so vague and so inscrutable [04:53.140 --> 04:59.020] that a moderator can suspend anybody for anything at any time by applying personal inference. [04:59.020 --> 05:00.340] Not what the poster meant. [05:00.340 --> 05:04.540] Not what thousands of people would construe that post to mean. [05:04.540 --> 05:14.180] But the moderator can say in their hubris, well, I say it means this, suspend. [05:14.180 --> 05:19.260] So that's how vague and inscrutable it is that any moderator can take down any post [05:19.260 --> 05:23.460] anytime and suspend the person for any reason. [05:23.460 --> 05:28.580] As far as I can see, there really is no supervision or proper oversight of moderators. [05:28.580 --> 05:30.420] They just do whatever they want. [05:30.420 --> 05:34.220] But back to the idea of you can make somebody feel inferior. [05:34.220 --> 05:36.780] Where is the line on that? [05:36.780 --> 05:40.060] There is no line on that. [05:40.060 --> 05:43.900] For instance, if I looked at that conversation, if I was a moderator, God forbid, Jesus, can [05:43.900 --> 05:47.740] you imagine what petty little trivial pieces of crap take on the job of moderator? [05:47.740 --> 05:48.860] Oh my God. [05:48.860 --> 05:51.900] Talk about emotionally jacked up, screwed up people. [05:51.900 --> 05:56.580] But if I was to take up that job position, I would look at that and I would chuckle. [05:56.580 --> 05:59.540] I would go, here's two people having fun. 
[05:59.540 --> 06:07.140] The person who looks at that and says it could make someone feel inferior is themselves a [06:07.140 --> 06:11.180] person who feels inferior. [06:11.180 --> 06:15.140] They're virtue signaling using their employment. [06:15.140 --> 06:16.300] And it's disgusting. [06:16.860 --> 06:22.180] My view of Facebook generally in terms of its moderation, if you look at platforms like [06:22.180 --> 06:28.820] Twitter or YouTube, which is owned by Google, I'm not saying I approve of what they do, [06:28.820 --> 06:35.880] but at least their moderation has a purpose. [06:35.880 --> 06:38.740] It is to achieve an end. [06:38.740 --> 06:43.860] Now with Facebook, they have lost track of that. [06:43.860 --> 06:47.660] Moderation is not a step they take to achieve an end. [06:47.660 --> 06:50.020] Moderation simply is the end. [06:50.020 --> 06:54.300] And Zuckerberg's made a big mistake because what he's done is he's put somebody in charge [06:54.300 --> 07:02.940] of moderation who adores moderation for the sake of moderation. [07:02.940 --> 07:05.100] It's not achieving anything. [07:05.100 --> 07:08.380] As a matter of fact, it's driving people off the platform. [07:08.380 --> 07:14.300] So what Zuckerberg should do is take somebody like me, who believes that if you're going [07:14.300 --> 07:18.420] to engage in moderation, it should be for significant issues and significant issues [07:18.420 --> 07:19.980] only. [07:19.980 --> 07:23.300] Things like inciting violence, things like threatening other people, I'm going to come [07:23.300 --> 07:26.180] to your house and shoot you in the face, things like that. [07:26.180 --> 07:27.180] This is a problem. [07:27.180 --> 07:31.900] I think anybody who owns a platform, who manages a platform, would look at that and say, OK, [07:31.900 --> 07:33.660] this is an issue. [07:33.660 --> 07:38.300] So that's not who Zuckerberg put in charge, obviously, of the moderation. 
[07:38.300 --> 07:42.940] Zuckerberg has put somebody in charge who loves moderation for the sake of moderation. [07:42.940 --> 07:49.100] In other words, there is no limit when somebody says, OK, that's enough moderation. [07:49.100 --> 07:51.860] Whoever's in charge, that line doesn't exist. [07:51.860 --> 07:54.540] More and more and more and more and more and more moderation. [07:54.540 --> 08:02.560] And what we have when we look at that level of moderation is a wholesale, all-out attack [08:02.560 --> 08:04.060] on free speech. [08:04.900 --> 08:08.420] In the case of the man and woman going back and forth talking about, what Facebook has [08:08.420 --> 08:12.460] done is said, it doesn't matter what you're attempting to convey to one another. [08:12.460 --> 08:21.220] You will use Facebook-acceptable words or we'll take you off the platform, we'll suspend you. [08:21.220 --> 08:24.300] That's Orwellian shit right there, yes? [08:24.300 --> 08:31.780] So I'm looking at my years-long association, for lack of a better term, with Facebook. [08:31.780 --> 08:35.660] And I'm somebody who has a passion for personal liberty. [08:35.660 --> 08:43.700] And here I am with this relationship with this platform that is, in my opinion, oppressive, [08:43.700 --> 08:51.220] tyrannical, dictatorial, and stands against personal liberty. [08:51.220 --> 08:52.220] It's not moderating anymore. [08:52.220 --> 08:54.100] It is now attacking its own users. [08:54.100 --> 09:00.180] It is feeding on the people who provide the time, energy, and money for Facebook to profit. [09:00.180 --> 09:09.820] So I look at that relationship and I said, I'm starting to feel that this relationship [09:09.820 --> 09:15.460] is undermining my personal ethical structure. [09:15.460 --> 09:17.180] So I decided I'm going to step away from Facebook. [09:17.180 --> 09:18.820] Now that's not going to be an overnight process. 
[09:18.820 --> 09:24.940] I'm going to move away slowly and go on to various other social media platforms and podcasting [09:24.940 --> 09:26.060] being one of them. [09:27.060 --> 09:33.620] We talk about other social media sites, things like Gab, Parler, Telegram, Corner Tables. [09:33.620 --> 09:37.140] And I'll do a video or a podcast, I've got to reorient my brain. [09:37.140 --> 09:41.060] I'll do a podcast and I'll share with you where all of this content will appear on the [09:41.060 --> 09:42.060] various platforms. [09:42.060 --> 09:48.380] But for today, I wanted you to know why I am back in the podcast game again. [09:48.380 --> 09:53.140] So with that under our belt, I want to talk about the real issue why I'm here today. [09:53.140 --> 09:55.860] I wanted to share with you why I've started podcasting again. [09:55.980 --> 10:00.300] There's another issue I want to talk about that was just introducing why I'm back to [10:00.300 --> 10:01.300] podcasting. [10:01.300 --> 10:06.580] So I want to share with you a post that I placed on my personal Facebook page maybe [10:06.580 --> 10:08.260] a week ago. [10:08.260 --> 10:14.740] The last five years, especially the last election, have shown that a good percentage of the population [10:14.740 --> 10:17.940] are willing to buy into any stupid shit. [10:17.940 --> 10:21.300] By the way, I apologize for my computer making noise. [10:21.300 --> 10:24.660] I'm finding out the problems since I'm just coming back to podcasting, I'm finding out [10:24.660 --> 10:26.900] the problems having my Mac sitting here. [10:26.900 --> 10:29.620] So next time you won't hear any sounds, I apologize. [10:29.620 --> 10:31.860] Okay, so back to the text. [10:31.860 --> 10:38.500] I began to wonder whether people have always been this way or whether something has changed. [10:38.500 --> 10:42.020] I've been pondering that question for some time now. 
[10:42.020 --> 10:46.380] In terms of the question, I have been fortunate to have lived in both the pre-internet and [10:46.380 --> 10:48.020] internet worlds. [10:48.020 --> 10:53.140] In the pre-internet world, the establishment had nearly 100% control of what information [10:53.140 --> 10:57.020] the public would, but not could, access. [10:57.020 --> 11:02.980] In other words, the establishment didn't ban information, such as Facebook does. [11:02.980 --> 11:10.020] But the lazy masses only got what was spoon-fed to them by television and radio news, newspapers, [11:10.020 --> 11:11.540] and leading magazines. [11:11.540 --> 11:18.340] As a side note, it took me literally thousands and thousands of hours of research, much of [11:19.340 --> 11:24.340] to be able to present to you, with Income Tax Shattering the Myths, how the United States [11:24.340 --> 11:29.020] government has conned you, conned the vast majority of Americans into falsely believing [11:29.020 --> 11:31.380] they owe the income tax. [11:31.380 --> 11:32.500] The information was there. [11:32.500 --> 11:33.500] That's my point. [11:33.500 --> 11:38.380] The establishment didn't ban it, but it took me thousands and thousands of hours of research, [11:38.380 --> 11:43.500] and with assistance from scores of others, to finally put all the pieces of the puzzle [11:43.500 --> 11:44.500] together. [11:44.500 --> 11:46.020] So that's how the establishment's been doing things. [11:46.020 --> 11:49.300] So back to the post. [11:49.300 --> 11:54.820] With the advent of the internet, the establishment began to lose control over what information [11:54.820 --> 11:56.340] people consumed. [11:56.340 --> 12:00.780] People started to have the ability to research issues of interest for themselves from the [12:00.780 --> 12:03.300] comfort of their home or office. [12:03.300 --> 12:07.220] In my mind, this is the amazing benefit of the internet. 
[12:07.220 --> 12:12.060] You can learn almost anything while sitting in your living room. [12:12.060 --> 12:16.100] I am a prime example, having learned more things that mattered to me in the last ten [12:16.100 --> 12:22.180] years when damn near everything and anything became available on the web than in the entirety [12:22.180 --> 12:23.940] of my life before that. [12:23.940 --> 12:27.740] But as with anything, there is a dark side. [12:27.740 --> 12:31.460] Having virtually unlimited access to information doesn't mean the masses will use it as have [12:31.460 --> 12:35.860] I. My perspective is they haven't. [12:35.860 --> 12:39.980] I'm not sure I have the right words to describe how I see it. [12:39.980 --> 12:41.780] I'll give it a go. [12:41.780 --> 12:46.580] It seems to me the masses use the power of the internet not to increase their intellect [12:46.580 --> 12:50.660] with facts, evidence, and meaningful data, but rather to seek out whatever feeds their [12:50.660 --> 12:51.980] emotions. [12:51.980 --> 12:59.500] They are so not self-aware, they don't even see that what passes in their minds for knowledge [12:59.500 --> 13:03.680] is actually whatever bullshit they find that satisfies their emotions. [13:03.680 --> 13:10.620] They don't seek or value accuracy, correctness, or factualness. [13:10.620 --> 13:15.240] Because those would be restrictive to the subconscious need to accept only information [13:15.240 --> 13:17.380] that feeds their emotions. [13:17.380 --> 13:20.340] I see two possibilities for the future. [13:20.340 --> 13:22.380] There may be more that I can't envision. [13:22.380 --> 13:27.720] Number one, the continued need to use this newfound unfettered access to information [13:27.720 --> 13:33.660] to feed one's emotions, like a heroin addict uses heroin, is the timeless nature of mankind [13:33.820 --> 13:42.340] And so the gift, that would be unlimited information, will always be misused and abused by the masses. 
[13:42.340 --> 13:48.060] Number two, mankind was or is not mentally or emotionally yet prepared for access to [13:48.060 --> 13:52.260] virtually anything anyone creates, good, bad, or different. [13:52.260 --> 13:59.500] And so there is a maturation process or a period in which humans learn how to use the [13:59.500 --> 14:08.340] amazing gift of unlimited access to information for a higher purpose than emotional self-gratification. [14:08.340 --> 14:12.000] If it is number one, we are well and truly fucked. [14:12.000 --> 14:15.700] And I won't get into that today, period. [14:15.700 --> 14:16.700] End of post. [14:16.700 --> 14:24.100] Okay, so first of all, having said all of that, the first thing that I want to say, [14:24.100 --> 14:29.380] and I realized this like the day after I wrote this, the first thing I want to say is that [14:29.380 --> 14:30.380] I'm full of shit. [14:30.380 --> 14:35.060] Okay, let me explain that comment, and I am, and I want to explain why. [14:35.060 --> 14:43.740] So first of all, I watched, or I'm sorry, I listened to a 2012 episode of my radio show [14:43.740 --> 14:46.060] that I used to do back then. [14:46.060 --> 14:51.440] And I found that in 2012, I was having the same exact dialogue with my audience about [14:51.440 --> 14:56.020] the fact that the American people don't appear to give a shit about facts, evidence, or data. [14:56.020 --> 15:00.660] They just want to feel the emotion, right? [15:00.660 --> 15:03.060] They just want to be validated. [15:03.060 --> 15:09.560] They just want to feel like they fit in with their community, whatever community that might [15:09.560 --> 15:14.100] be, whether it's speaking politically for a moment, whether it's the left-leaning community [15:14.100 --> 15:16.420] or the right-leaning community. [15:16.420 --> 15:18.980] They just want to fit in. 
[15:19.060 --> 15:26.780] I've talked at length before about the fact that people define themselves by what positions [15:26.780 --> 15:32.540] they adopt, and they adopt the position that feeds that emotional need to define themselves [15:32.540 --> 15:33.780] in a particular way. [15:33.780 --> 15:39.380] They don't seek out the information and go, well, will you look at that? [15:39.380 --> 15:41.820] Will you look at that? [15:41.820 --> 15:45.780] I guess I should take this point of view because the data compels that. [15:45.780 --> 15:46.780] They don't do that. [15:47.420 --> 15:49.220] This is who I am. [15:49.220 --> 15:56.740] Oh, and this piece of data supports who I feel I am. [15:56.740 --> 15:58.020] So that's the first thing. [15:58.020 --> 16:00.780] I was talking about that back in 2012. [16:00.780 --> 16:05.420] So yeah, I'm full of shit when I say the last five years. [16:05.420 --> 16:07.220] That's not true. [16:07.220 --> 16:11.940] I literally, when I listen to the radio show, I was like, you're kidding me. [16:11.940 --> 16:16.860] I was talking about this very same dynamic nine years ago. [16:16.860 --> 16:18.660] I'd totally forgotten. [16:18.660 --> 16:27.340] So the other reason I'm full of shit is that when I said people do not care about facts [16:27.340 --> 16:36.260] or accuracy or correctness because that's an obstacle to them looking at the information [16:36.260 --> 16:41.740] that feeds them emotionally and that I have done it differently, I haven't done it differently. [16:41.860 --> 16:43.380] That's bullshit. [16:43.380 --> 16:49.020] I wrote what was on my mind at the time, and then upon reflection, I realized that's bullshit. [16:49.020 --> 16:58.220] I also seek out that which feeds my emotional construct, my perception of self. 
[16:58.220 --> 17:06.180] In other words, in that limited regard, I'm no different than all the people I was presumptuously [17:07.020 --> 17:12.660] criticizing, that I was presumptuously criticizing. [17:12.660 --> 17:16.380] In that narrow category, I am no different than they are. [17:16.380 --> 17:18.700] But there is a difference. [17:18.700 --> 17:20.820] And here's where we get into the difficulty. [17:20.820 --> 17:27.500] The difference is they define themselves in some way I admit I don't understand. [17:27.500 --> 17:28.700] Let me be very clear about that. [17:28.700 --> 17:30.940] I don't understand. [17:30.940 --> 17:31.940] Give you an example. [17:31.940 --> 17:37.700] I saw last night on Facebook a dialogue between someone I know is a fervent Trump supporter [17:37.700 --> 17:45.020] and a Facebook friend of hers, obviously a fervent Trump supporter, who was saying just [17:45.020 --> 17:52.460] yesterday in mid-March that Trump could still be president. [17:52.460 --> 17:54.420] That's some insane shit right there. [17:54.420 --> 17:56.820] And it wasn't talking about 2024. [17:56.820 --> 18:03.460] It was talking about this past November 2020 election, that Trump could still become president [18:03.460 --> 18:08.740] anyway even now in mid-March. [18:08.740 --> 18:12.060] So there's something about that person, I'm just using her as an example. [18:12.060 --> 18:18.980] There's something about that person that causes her to be emotionally fed, emotionally gratified [18:18.980 --> 18:22.780] by insane fucking shit, right? [18:22.820 --> 18:26.620] I get emotionally gratified on the other end of the spectrum. [18:26.620 --> 18:30.700] In other words, and I'm divorcing this from politics, I use the Trump example in that [18:30.700 --> 18:35.100] gal because it was something I saw last night and it's in my head. 
[18:35.100 --> 18:39.740] Leaving politics aside, whether we're talking about science, whether we are talking about [18:39.740 --> 18:43.540] politics, whether we're talking about current events, whether we're talking about interpersonal [18:43.540 --> 18:49.100] relations, no matter what we're talking about, what emotionally feeds me, that's that parallel [18:49.140 --> 18:58.980] to the other people I was pretentiously criticizing, what feeds me is factualness, accuracy, and [18:58.980 --> 19:00.780] correctness. [19:00.780 --> 19:04.260] That is how I define myself. [19:04.260 --> 19:11.100] So the process for me is no different than the process for the people that I was pretentiously [19:11.100 --> 19:12.780] criticizing. [19:12.780 --> 19:17.340] We both engage in the same process. [19:17.340 --> 19:22.660] So then, once we acknowledge that, or once I acknowledge that, then the conversation [19:22.660 --> 19:25.660] becomes why. [19:25.660 --> 19:37.900] Why am I emotionally gratified by knowing that when I speak, what I put out, or write, [19:37.900 --> 19:48.140] what I put out is factual, accurate, and correct to the best of my ability. [19:48.140 --> 19:55.340] And when others speak or write, they don't give a fuck about factualness, accuracy, or [19:55.340 --> 19:56.340] correctness. [19:56.340 --> 19:57.340] They don't care. [19:57.340 --> 20:05.540] I guess for the purpose of this podcast, why that distinction exists, I don't need to answer [20:05.540 --> 20:06.540] that. [20:07.180 --> 20:09.740] Frankly, I don't know that I can. [20:09.740 --> 20:15.500] One of the things that I enjoy, whether it's written posts, whether it's my books, whether [20:15.500 --> 20:20.900] it's videos, podcasts like this, one of the things that I think is really important is [20:20.900 --> 20:27.460] that, again, emotionally gratifies me is being transparent with you. [20:27.460 --> 20:35.220] So it's not challenging for me to say that I don't know, because I don't. 
[20:35.220 --> 20:39.660] I would rather sit here and be completely forthright with you and tell you, I don't [20:39.660 --> 20:43.740] have a fucking clue, than try and spin some BS. [20:43.740 --> 20:53.980] So I don't really know why a certain type of person, and sadly, if we were to talk about [20:53.980 --> 20:59.900] somebody I don't know, somebody who's seriously mentally ill, especially if that person being [20:59.900 --> 21:04.820] mentally ill manifests itself through criminal conduct, harming others, destroying property, [21:04.820 --> 21:08.260] and so forth, we can look at that and say, OK, that's bad. [21:08.260 --> 21:10.460] That's a problem. [21:10.460 --> 21:14.300] But somebody who's mentally ill to that extent is a very, very, very, very, very, very small [21:14.300 --> 21:16.820] part of the population. [21:16.820 --> 21:21.220] The disturbing part for me is what I'm describing here today, that certain people take emotional [21:21.220 --> 21:28.860] gratification from spewing absolute fucking bullshit. [21:29.620 --> 21:38.900] I have no explanation for, and it is a very sizable, in my view, percentage of the population. [21:38.900 --> 21:46.460] Obviously, I'm not out talking to tens of thousands of people face to face all the time, [21:46.460 --> 21:52.860] so my experience with this comes from primarily social media and all the various platforms [21:52.860 --> 21:55.260] I'm on and I visit and I participate in. [21:55.260 --> 22:01.420] So perhaps shit talking, not being factual, accurate, or correct, is a facet of social [22:01.420 --> 22:02.420] media. [22:02.420 --> 22:03.420] Perhaps. [22:03.420 --> 22:04.420] I don't think so. [22:04.420 --> 22:05.420] Let me tell you why I don't think so. [22:05.420 --> 22:08.940] I live in a small town called Pahrump, Nevada. [22:08.940 --> 22:11.220] I think the town is 34,000 people. [22:11.220 --> 22:12.860] Now I know some of you live in smaller towns. 
[22:12.860 --> 22:16.460] I come from Los Angeles, so this is a really small town. [22:16.460 --> 22:23.140] Back in 2016, I think it was 81% of the voters voted for Donald Trump, so that tells you [22:23.220 --> 22:30.500] what the predominant political framework or mindset is in this town. [22:30.500 --> 22:34.100] There are a lot of people in this town who no longer speak to me. [22:34.100 --> 22:37.540] I don't mean they're like, you know, they're like, oh, I don't talk to you, fuck him. [22:37.540 --> 22:39.380] They're not like that. [22:39.380 --> 22:46.500] But during the campaign for the election where Donald Trump was elected president, and during [22:46.500 --> 22:51.100] the four years he was president, I would sit down with people I'd known for years here [22:51.100 --> 22:52.100] in this town. [22:52.100 --> 22:55.540] I moved here 11 years ago as I'm sitting here talking to you now. [22:55.540 --> 22:57.820] So I moved here 11 years ago. [22:57.820 --> 22:58.820] Almost 12 years ago. [22:58.820 --> 22:59.820] That's scary. [22:59.820 --> 23:00.820] OK. [23:00.820 --> 23:04.020] So I knew a lot of people, and I would sit down and I would talk with them, and we'd [23:04.020 --> 23:08.660] visit in one of the stores, or we'd have lunch together. [23:08.660 --> 23:15.180] And I noticed over time that started to diminish. [23:15.180 --> 23:18.620] People would not want to stand and talk to me in a store. [23:18.620 --> 23:22.140] They would not say, hey, man, let's you and I get together, sit down and have lunch over [23:22.140 --> 23:24.380] at that restaurant that we enjoyed so much last time. [23:24.380 --> 23:27.740] I noticed over time that dwindled. [23:27.740 --> 23:31.180] And I know exactly why it dwindled. [23:31.180 --> 23:35.180] Because we would sit and talk or stand and talk if it was in a store. 
[23:35.180 --> 23:38.380] And they would say, well, this is going on, and that's going on, and this is going on, [23:38.380 --> 23:41.900] and the other is going on. [23:41.900 --> 23:47.460] None of it was factual, correct, or accurate. [23:47.460 --> 23:51.060] So I would very politely say, these are my neighbors. [23:51.060 --> 23:54.340] I would say, well, that's not exactly factual. [23:54.340 --> 23:58.100] Let me tell you what the facts are concerning this issue. [23:58.100 --> 23:59.700] This is a fact. [23:59.700 --> 24:01.200] This is a fact. [24:01.200 --> 24:02.340] These are facts. [24:02.340 --> 24:03.340] This is a fact. [24:03.340 --> 24:06.420] And you can go look up all these facts for yourself. [24:06.420 --> 24:12.900] So with respect, Bob or Sue or whoever I was speaking with, so respectfully, I don't know [24:12.900 --> 24:14.300] where you got that narrative. [24:14.540 --> 24:15.540] Yeah, I do. [24:15.540 --> 24:17.500] But again, these are my neighbors. [24:17.500 --> 24:19.020] I don't know where you got that narrative. [24:19.020 --> 24:26.420] But that narrative is nonfactual, incorrect, and inaccurate. [24:26.420 --> 24:32.180] And you can go verify all the information I just gave you, which is factual, correct, [24:32.180 --> 24:33.300] and accurate. [24:33.300 --> 24:38.660] So I just want you to know, as you go about town and you're talking to people, I just [24:38.660 --> 24:42.300] want you to know that that whole narrative you just spun for me, there's no facts or [24:42.300 --> 24:44.340] accuracy or correctness undergirding that. [24:44.340 --> 24:47.300] It's all just false narrative stuff. [24:47.300 --> 24:51.860] And the people that I was talking to are the very people I'm describing to you here today. [24:51.860 --> 25:00.540] They are people who don't want things that are factual, accurate, or correct, because [25:00.540 --> 25:04.860] that doesn't serve their purposes. 
[25:04.860 --> 25:11.860] Each one of them, as they sat there and spewed a false narrative at me, that empowered them. [25:12.020 --> 25:13.580] That fed them. [25:13.580 --> 25:14.700] They liked that. [25:14.700 --> 25:18.820] They felt like they knew stuff, and they were powerful, and they were going to share this [25:18.820 --> 25:20.660] with me and change the world. [25:20.660 --> 25:23.860] And it was all bullshit. [25:23.860 --> 25:30.660] So that experience here in my little town tells me it's not merely a social media phenomenon. [25:30.660 --> 25:34.020] It's just a human phenomenon. [25:34.060 --> 25:44.500] And again, I want to admit, I have no idea, no idea why anyone would want to present themselves [25:44.500 --> 25:45.500] like that. [25:45.500 --> 25:49.700] I see it day in and day out on social media, of course, with the SARS-CoV-2 outbreak. [25:49.700 --> 25:54.300] I haven't had as much face-to-face time with people, but obviously social media is bigger [25:54.300 --> 25:55.340] than ever. [25:55.340 --> 25:59.260] So I see people making those same kinds of comments, things that I used to hear sitting [25:59.260 --> 26:02.220] at lunch or standing in the store talking to my neighbors here in town. [26:02.300 --> 26:07.540] I hear the same kind of thing on social media, constantly. [26:07.540 --> 26:15.300] On my personal Facebook page, the people who participate regularly are very intelligent [26:15.300 --> 26:18.460] and very in tune and very fact-based. [26:18.460 --> 26:21.780] I actually have a policy on my personal, I don't do this on the Dr. Reality page because [26:21.780 --> 26:24.380] that's more of a big community scene. [26:24.380 --> 26:29.380] On my personal page, which is not a big community scene, it's like my living room, I have a [26:29.420 --> 26:30.980] policy. [26:30.980 --> 26:36.900] If you come on my page and you lie, you tell a lie, I'm going to admonish you. 
[26:36.900 --> 26:38.780] I'm going to say, number one, that's a lie. [26:38.780 --> 26:40.380] Here's what the real fact is. [26:40.380 --> 26:43.660] And I don't allow lies on my personal page. [26:43.660 --> 26:45.460] Do not lie. [26:45.460 --> 26:50.900] If you come on this page and lie, you'll get one, maybe two warnings. [26:50.900 --> 26:53.660] But if you keep lying, I will remove you from this page. [26:53.660 --> 26:55.480] Now, my personal page is wide open. [26:55.480 --> 26:57.380] Anybody can go there and look at it. [26:57.380 --> 27:02.800] But by removing them, by unfriending them, it prevents them from posting lies, right? [27:02.800 --> 27:07.660] Because I find that kind of thing problematic. [27:07.660 --> 27:13.780] So problematic that I might even go so far as to say, and this is about me, it's [27:13.780 --> 27:14.780] not about them, [27:14.780 --> 27:17.220] I find it, like, insulting. [27:17.220 --> 27:24.600] You came on my page, you came into my living room, and you fucking lied. [27:24.600 --> 27:27.100] You sat in my living room, euphemistically speaking. [27:27.100 --> 27:32.100] You sat in my living room, talking to my friends, and you fucking lied. [27:32.100 --> 27:35.020] In my living room, who the fuck do you think you are? [27:35.020 --> 27:38.100] That's kind of like the emotional content I have when people come on my personal Facebook [27:38.100 --> 27:40.160] page and lie. [27:40.160 --> 27:50.720] So going back to the main theme, I'm no different than anybody else, in that the information that [27:50.720 --> 27:55.440] attracts me is information that emotionally gratifies me. [27:55.440 --> 28:01.120] The distinction turns out to be, after I acknowledge that part of what I said was full of shit, [28:01.120 --> 28:07.000] that what gratifies me is factualness, correctness, accuracy.
[28:07.000 --> 28:15.120] And what gratifies others is something else, which, I'm admitting for I don't know, [28:15.120 --> 28:21.160] the sixth time, I have no clue what that is. [28:21.960 --> 28:31.840] I want to take a moment and suggest that if you find accuracy and correctness and factualness [28:31.840 --> 28:39.200] a positive thing, something you value in your life, may I please suggest you go to DrReality.News [28:39.200 --> 28:44.440] and pick yourself up a copy of Income Tax Shattering the Myths or Body Science or both. [28:44.440 --> 28:48.980] There's a news story that just came out today about income tax, and I'm going to come back [28:48.980 --> 28:52.300] and talk about that, and I'll talk more about Income Tax Shattering the Myths when I do [28:52.300 --> 28:54.800] that story, because they're related topics. [28:54.800 --> 29:00.740] But I do want you to know: Income Tax Shattering the Myths took 17 years, 17 years of in-depth [29:00.740 --> 29:06.100] research, to finally connect all of the dots and put all of the pieces of the puzzle [29:06.100 --> 29:13.440] in place, and to do so in a way that, because it's basically sort of a history-law book, [29:13.440 --> 29:15.200] so obviously that can be intimidating, [29:15.200 --> 29:20.140] it's put together in a way that every single person, I'm just going to say on the planet, [29:20.140 --> 29:25.020] but somebody in, like, I don't know, India probably has little interest in US income [29:25.020 --> 29:28.940] tax law, so that every single American can understand it. [29:28.940 --> 29:34.020] Now, because there is a high volume of information, people have had to read it [29:34.020 --> 29:38.700] a couple of times in order for it all to implant, but no one's [29:38.700 --> 29:40.060] ever read it and said, I don't get it. [29:40.060 --> 29:41.060] I don't understand. [29:41.060 --> 29:44.960] Instead it's, oh my God, why doesn't the whole nation know this? [29:44.960 --> 29:45.960] Okay.
[29:45.960 --> 29:46.960] And then Body Science. [29:46.960 --> 29:49.440] Yeah, you've probably heard me talk about this a ton of times. [29:49.440 --> 29:53.560] The establishment narrative on how human physiology operates is utter and complete bullshit. [29:53.560 --> 29:59.440] But it has allowed big pharma, big med, and big food to become multi-trillion-dollar industries. [29:59.440 --> 30:05.960] And the reason that the scam from the government and researchers and pseudoscience continues, [30:06.740 --> 30:13.140] absolute utter bullshit, is to prop up and to keep big food, big med, and big pharma [30:13.140 --> 30:20.580] making trillions of dollars as America gets sicker and sicker and sicker, which it has, [30:20.580 --> 30:21.580] by the way. [30:21.580 --> 30:22.580] You can look at chronic disease. [30:22.580 --> 30:23.580] Wow. [30:23.580 --> 30:25.320] And it's all for money. [30:25.320 --> 30:31.640] You're being sacrificed on the altar of corporate profits for big pharma, big med, and big food. [30:31.640 --> 30:36.900] So if you want to know the truth about how your body's physiology actually operates, [30:36.900 --> 30:41.620] if you want to take this whole fake construct that's been built for you, sort of like the [30:41.620 --> 30:46.340] Matrix, and you want to yank yourself out of that, yeah, get yourself a copy of Body [30:46.340 --> 30:52.300] Science. Whether it's Income Tax Shattering the Myths or Body Science, you have my word [30:52.300 --> 30:54.820] that they will both blow your mind. [30:54.820 --> 30:59.460] What you choose to do once you understand what's in them, that's up to you, not me, [30:59.460 --> 31:01.020] but they will both blow your mind. [31:01.020 --> 31:07.520] So I want to thank you for being here for episode number 1,000 of the Dr. Reality Dave [31:07.520 --> 31:10.680] Champion podcast, the first one in many, many years.
[31:10.680 --> 31:14.760] I apologize if there are any mistakes, whether they're technical or with the sound from my computer [31:14.760 --> 31:15.760] and so forth. [31:15.760 --> 31:17.840] I will trim that up as time goes on. [31:17.840 --> 31:19.120] So please do tune in again. [31:19.120 --> 31:19.620] Thanks.