NC_2025_01_12
In this episode of the NosillaCast, we recap CES 2025, spotlighting innovative accessibility products like robotic emotional support animals and smart glasses, while inviting audience feedback.
Automatic Shownotes
Chapters
0:00
NC_2025_01_12
1:25
CES 2025 - How We Cover the Show
10:55
Tiny Mac Tips 4 Video Tutorial on ScreenCastsONLINE
11:41
CES 2025: Tombot Robotic Puppies for Those Facing Health Adversities
17:21
CES 2025: Soliddd - Vision Correction for Macular Degeneration
28:33
CES 2025: Monar Canvas Speaker Blends Art, Music, and Technology
30:48
Support the Show
31:29
CCATP #806 — Andrea Jones-Rooy on Being a Data Scientist
Long Summary
In this episode of the NosillaCast, I delve into the exciting whirlwind of activity surrounding CES 2025 alongside my husband, Steve. We share insights about our experience navigating the event, highlighting our unique approach to showcasing innovative technology. This year, we focused on smaller vendors and unique products that resonate with our audience, diverging from mainstream media narratives that often spotlight larger brands like massive TVs. Our interviews were primarily gathered from three exclusive pre-show press events: Unveiled, Pepcom, and Showstoppers, along with the addition of a new event called Everything Tech, spotlighting 15 to 20 intriguing companies.
We're particularly thrilled to discuss our tour, sponsored by the CTA Foundation, which aimed to connect seniors and people with disabilities to technologies that significantly enhance their lives. This initiative highlighted several accessibility-focused innovations, and while we couldn't feature all five companies in our interviews, we managed to bring you four standout products that embody this mission. Over the course of CES, we achieved a personal record by recording 57 interviews, roughly six hours of content that we're distributing judiciously over the coming weeks to ensure a manageable flow, allowing time for our regular programming.
To provide clarity on our production process, I dive into the behind-the-scenes details of how we capture and edit content. Steve utilizes professional-grade equipment, including a Panasonic 4K video camera, to record our interviews with high audio quality. Once back at our hotel, he downloads and labels the footage to facilitate our internal Airtable database, which helps streamline our workflow. Each day, I enter data from business cards collected from various vendors, while Steve fills in additional details after we return home. Our post-production employs Final Cut Pro to finalize visually appealing videos for our audience, ensuring that the content is effortlessly accessible on our website and YouTube channel.
In a bid to enhance viewer experience, we are also experimenting with adding transcripts and closed captions to our videos. This initiative caters to both segments of our audience—those who prefer video and others who appreciate written content. I share the technical details of how we process audio using Auphonic software to generate transcripts, ensuring that everyone can access our discussions regardless of their viewing preferences.
Additionally, our conversations take a fascinating turn as we discuss standout products we encountered at CES. One highlight includes an interactive robotic emotional support animal, developed to assist those with conditions like dementia. We also explore innovative smart glasses designed to aid individuals suffering from macular degeneration, utilizing cutting-edge technology that restores functional vision. Furthermore, I introduce a unique product that combines art and sound in the form of a smart canvas speaker, accentuating the creative possibilities within technology.
As we wrap up the episode, I express my excitement about upcoming content and the introduction of valuable insights on data science and personal development with special guest Andrea Jones-Rooy. Andrea shares her expertise in data science while weaving in elements of humor and performance, highlighting the intersections between technical proficiency and engaging communication. In reflecting on our experiences at CES, I emphasize the importance of maintaining a curious mindset and being open to the myriad ways technology can enhance our lives.
Finally, I invite our audience to engage with the show through feedback and community discussions. Listeners can reach out via email, join our Slack community, or support our work through Patreon or one-time donations. We will continue to provide fresh, insightful content as we explore the exciting ways technology evolves to meet our needs.
Brief Summary
In this episode of the NosillaCast, I join my husband, Steve, to recap our experience at CES 2025, focusing on innovative technologies from smaller vendors. We share insights from 57 interviews captured at various pre-show events, highlighting accessibility-focused products for seniors and individuals with disabilities. Our production process is unwrapped, detailing how we ensure high-quality audio and video content. We discuss standout innovations like an interactive robotic emotional support animal and smart glasses for macular degeneration. I also preview an upcoming chat with data science expert Andrea Jones-Rooy and invite audience engagement through feedback and community discussions.
Tags
NosillaCast
CES 2025
innovative technologies
smaller vendors
accessibility
seniors
disabilities
audio
video content
emotional support animal
smart glasses
data science expert
community discussions
Transcript
[0:00]
NC_2025_01_12
[0:00]Hi, this is Allison Sheridan of the NosillaCast podcast, hosted at Podfeet.com, a technology geek podcast with an ever-so-slight Apple bias. Today is Sunday, January 12, 2025, and this is show number 1027. We've been getting a lot of people asking if Steve and I are okay in Los Angeles with all of the fires. I do want to assure you that we're fine for now. The fires are about 12 miles away from us, so we're not super worried about it. But I do want to say that our hearts go out to everyone who's been affected by this horrible, horrible tragedy. Before we get started, if you were looking for the standalone version of Chit Chat Across the Pond with Bart from last week, when he talked about what it's like to be a cybersecurity specialist, it is now available in your podcatcher of choice. In all the excitement and work to get ready for CES, somehow I forgot to push it to the feed. Since I forgot to push Bart's episode last week, my automation gave me the wrong episode number for this week's Chit Chat Across the Pond with Andrea Jones-Rooy. Both episodes say in the audio that they're number 805. I've now labeled them correctly in the titles, but I wanted to clear up any confusion that may have been created by that little mistake. You will, of course, hear Andrea this week, and it is a fantastic interview. Now, since you're listening to the NosillaCast, you may never have noticed this mistake, but I personally needed the closure.
[1:25]
CES 2025 - How We Cover the Show
[1:25]Well, Steve and I are back from CES, and as always, it was a whirlwind of activity. Now, unlike most people, I actually get fewer steps in when I go to CES than I do when I'm at home. Tells you what a beast I am on the exercise front, right? Well, our brains are absolutely packed full of information. We've been making a few little mistakes here and there, like I mentioned earlier, so we'll get things going, I think, pretty quickly. And when Steve and I go to CES, it's a bit different than you might expect if you watch the big coverage of the show. You know, while the big outlets talk about massive TVs, we like to seek out smaller vendors with either products we know will be relevant to the audience or maybe simply unique and interesting. Also, the weird ones always catch my eye. And we get these interviews primarily from three press events that start even before CES ever begins. While the massive convention halls open on Tuesday, we arrive on Saturday afternoon in Las Vegas. We like to take Sunday to visit friends like Sandy Foster, my wingwoman, in the live chat room.
[2:25]In addition to getting to see Sandy, we go to a lot of the events with Dave Hamilton and Pete Harmon of the Mac Geek Gab, Chuck Joyner, and Norbert Frasa of Mac Voices. We also had dinner with J.F. Brissett, who's my editor at ScreenCastsOnline. It was so great to catch up with all of them. A special bonus prize this year was that Bodie Grimm of the Kilowatt podcast was there, and I got to be on Daily Tech News Show with him, Tom Merritt, and Rob Dunwood, along with Anthony Lemos, their producer. It was so great to see all of these wonderful friends.
[2:55]Now, Sunday, Monday, and Tuesday nights, we go to the press events called Unveiled, Pepcom, and Showstoppers. These are each in large ballrooms with maybe like 150 or so vendors that are handpicked and pay extra to be there. This year, we got into a fourth press event, which was much smaller. It was called Everything Tech. It only had about 15 to 20 vendors, but nearly every one of them had something interesting to offer. Now, Steve Ewell is the director of the CTA Foundation. They have a mission to link seniors and people with disabilities with technologies that enhance their lives.
[3:28]The CTA Foundation is a public national foundation affiliated with the Consumer Technology Association, CTA, and that's the people who put on CES. Steve also happens to be a NosillaCastaway. Because I know a guy, we were able to go on a tour with special presentations by five companies focusing on accessibility that the CTA Foundation chose to sponsor for CES 2025. Now, we didn't end up interviewing all of them, but I think we chose four of the five to bring to you because they look like terrific ideas. Over the course of four days of CES, Steve and I recorded a grand total of 57 interviews. This is a new personal record for us. At roughly, let's say, six minutes apiece, that's nearly six hours of content. Now, the good news is we are not going to inundate you with CES interviews for six straight hours. Instead, we like to push out the content maybe three or four videos a week. That kind of keeps it manageable for us, mostly Steve, and it leaves me breathing room to bring you more of the normal NosillaCast content you've come to know and love. As a little inside baseball, I'd like to describe the process of bringing you this content.
[4:36]To record the content for you, Steve uses his Big Boy video camera. It's a Panasonic 4K camera mounted on a monopod. On the cold shoe of the camera, he has an Audio-Technica receiver that takes the audio signal from a wireless handheld Audio-Technica microphone that I use to record both me and the interviewee.
[4:55]Every day when we get back to the hotel room, he downloads the videos to his MacBook Pro in the morning so we have a backup. We have an Airtable database I created years ago, and that lets us record all of the information from every one of the interviews. With Airtable, you can create an input form to populate the database instead of having to deal with, you know, a hundred fields and a gigantic spreadsheet-looking view. I created a form that has just the relevant fields for the business cards I collect. While Steve is downloading and labeling the videos on his laptop, I busily enter all the cards from the previous evening's recordings. Now, it's important that I do these first thing in the morning. If I procrastinate at all, I'll be looking at a business card that says something like VIVU, and I'll have no idea what the product was. But if I do it quickly enough, we can usually remember what each of them were. Now, when we get home, that's where Steve does, I'd say, at least 80% of the work, and I'm more in a support role. He fills in a lot more fields in Airtable, like the duration of the videos, the play order as we choose them, and more. For each of the, say, three to four videos per week, he opens them in Final Cut Pro on his Mac Studio.
[6:02]His first step is to export the audio of the recording to an uncompressed AIFF, and that's sent to my Mac so I can play it for you on the podcast. Once the original audio has been captured, that's when he starts working on the video. He tops and tails it, as John McAllister would say, by adding the music and video graphics he created for the show years ago. He creates a lower third with the name of the person I interviewed and their company name, taking all of that from the Airtable database.
[6:29]He then uploads the video to YouTube, and he creates the information you see about the video explaining the product. After he has the video posted on YouTube, he then embeds that video on a podfeet.com blog post, so you don't have to go to YouTube to hunt for it. But of course, if you're already on YouTube, you can find it there. He also writes all of the promotional messages for social media, and finally, he updates the database in Airtable to show that he has posted to all of those services.
[6:55]At this point, you might be wondering whether I do anything at all for the first few months of the year. It kind of reminds me of when my father discovered that Steve does 90% of the cooking for Thanksgiving, and he said, what do you do around here anyway? Well, anyway, yeah, he was a great guy. But anyway, part of the magic of using a database to track all of this is that we can create different views of the same data. There are a lot of fields I don't care about in the database, so I hide all of those from the Allison view. I also created a view called Allison Needs Action, and in there I was able to set up some conditional situations. If any of the following conditions are not met, the record for that interview will still show for me in this Allison Needs Action view. If Podfeet hasn't been added from the dropdown to Mastodon or Threads, or Allison hasn't reposted in her personal Facebook account, then I'll see the record. When everything is complete, the last step I do is email the person I interviewed and their marketing or PR rep to tell them the video is available. Of course, I have a TextExpander snippet to create those emails, and I just kind of fill in the blanks and add any little flavor I want to the particularly fun interviews. So the last step in the Airtable database is a checkbox to show that I've sent the email. As soon as I finish that last social media or email step in the database, the record disappears from Allison Needs Action, and I can tell that I'm done.
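For the more programmery NosillaCastaways, the logic behind a conditional view like Allison Needs Action can be modeled in a few lines of code. This is just an illustrative sketch in Python, not our actual Airtable configuration, and the field names are invented for the example:

```python
# Hypothetical model of the "Allison Needs Action" view described above.
# Each record tracks whether a publishing step has been completed;
# the record stays visible until every step is done.

def needs_action(record: dict) -> bool:
    """Return True if any publishing step is still outstanding."""
    steps = [
        record.get("posted_mastodon", False),
        record.get("posted_threads", False),
        record.get("reposted_facebook", False),
        record.get("email_sent", False),
    ]
    return not all(steps)

records = [
    {"title": "Tombot", "posted_mastodon": True, "posted_threads": True,
     "reposted_facebook": True, "email_sent": True},
    {"title": "Soliddd", "posted_mastodon": True, "posted_threads": False,
     "reposted_facebook": False, "email_sent": False},
]

# Only records with incomplete steps appear in the "needs action" view.
todo = [r["title"] for r in records if needs_action(r)]
print(todo)
```

In Airtable the same idea is expressed as view filter conditions rather than code, but the effect is identical: when the last checkbox is ticked, the record falls out of the view.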
[8:15]To be perfectly honest, going through this part of it, I actually have trouble keeping up with Steve because he's doing so much so quickly. Now we're testing out an improvement for CES 2025, and that's to add both a transcript and closed captions to the videos. My thought was that by having so much of the content be video and audio with no real blog post, I'm leaving out the part of the audience who love to read instead. I also wondered whether we could have a good closed caption set for the videos as well. The details of how we're doing the transcripts and closed captions are going to be in a standalone blog post and article that I'll read to you, but I'll give you a little bit of the flavor of what we're doing. With the NosillaCast, Chit Chat Across the Pond, and Programming by Stealth, I run the entire audio file through the Auphonic interface, which does a lot of things to process the audio. But as a bonus prize, it also gives me a full transcript of the podcast that I link to in the show notes. It generates a summary and chapter marks for the shows that don't already have them.
[9:12]I don't promise any work has been done on this file, so you get it for free, and maybe it's great, maybe it's not. But for these short videos, and maybe for the short interviews, like say when I chat with Pat Dangler or Jill from the North Woods about a topic, I wanted something a bit more tailored. On the videos today, you'll see a transcript that I've given some tender loving care to produce, but still using somewhat automated tools. The same software can produce a closed caption file in .srt format that Steve can then embed in the YouTube videos. There are automatically generated captions on YouTube, but we wanted to make sure ours were very precise. There were some tricky bits of figuring out how to create it in the correct format within the software I'm using, the order in which I create the closed captions versus the transcript, and how Steve would embed it. But I think we have it sorted into a pretty seamless workflow. We've only done three of them, but it started to get easier as we did more and more of them.
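If you've never peeked inside a .srt file, the format is simple: each cue is a sequence number, a start and end timestamp, and the caption text, separated by blank lines. Here's a small Python sketch that builds one; the cue timings and text are made up for illustration, and this isn't the tool we actually use:

```python
# Minimal sketch of generating SubRip (.srt) closed captions.
# Timestamps use the format HH:MM:SS,mmm (comma before milliseconds).

def srt_timestamp(seconds: float) -> str:
    """Format a time in seconds as an SRT timestamp: HH:MM:SS,mmm."""
    total_ms = round(seconds * 1000)
    h, rem = divmod(total_ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02}:{m:02}:{s:02},{ms:03}"

def make_srt(cues) -> str:
    """Build the text of an SRT file from (start_sec, end_sec, text) tuples."""
    blocks = []
    for i, (start, end, text) in enumerate(cues, start=1):
        blocks.append(f"{i}\n{srt_timestamp(start)} --> {srt_timestamp(end)}\n{text}\n")
    return "\n".join(blocks)

cues = [
    (0.0, 2.5, "Hi, this is Allison Sheridan of the NosillaCast."),
    (2.5, 5.0, "Today we're recapping CES 2025."),
]
print(make_srt(cues))
```

The tricky part mentioned above isn't the format itself but getting precise timings, which is why we lean on the transcription software rather than writing these by hand.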
[10:08]Now, just to make sure I don't forget what I learned this week, I whipped out my trusty software Folge from folge.me and created a step-by-step guide with screenshots to remind myself next week. Now, if you like the transcripts and or the closed captions, please do let me know, because it is extra work. Heck, even if you think they could be better, let me know that too. The bottom line is that we had a great time at CES seeing all of our friends, and we worked really hard to get a lot of fantastic content for you, most of which you won't find elsewhere. Rob Dunwood put it best when he said he wanted to follow us around to see what we do. After observing three or four interviews in progress, he said, I understand. All I need to do is get me a Steve. Maybe AI can make me one.
[10:55]
Tiny Mac Tips 4 Video Tutorial on ScreenCastsONLINE
[10:55]If you've been following along with my Tiny Mac Tips series in text form, or listening to them in the podcast, but you'd like to actually see some of them in video tutorial form, you're in luck. My latest ScreenCastsOnline video tutorial dropped and it's called Tiny Mac Tips 4. Remember my disclaimer, while you can get a free seven-day trial of ScreenCastsOnline to watch this tutorial and the current back catalog, you may find out that you love this service and want to join by going to ScreenCastsOnline.com. In this fourth round of Tiny Mac Tips, I teach you how to get access to more display resolutions, use QuickLook to triage files, use your iOS device to scan documents directly into your Mac, and uncover all of the hidden capabilities of the dock.
[11:41]
CES 2025: Tombot Robotic Puppies for Those Facing Health Adversities
[11:42]Well, I couldn't possibly walk by a booth that had puppies in it, so I've stopped to see Thomas Stevens of TomBot. Hi, I'm Tom Stevens. I'm CEO and co-founder of TomBot, and these young ladies are Jenny. Jenny is a fully interactive robotic emotional support animal inspired by my mother. After she was diagnosed with Alzheimer's dementia, I had to take away her dog for safety reasons. I looked around for substitutes for live animal companions. My mom hated everything that I brought home, so I realized there was a large gap in the market.
[12:15]Our first product is designed to treat the behavioral and psychological symptoms of dementia and reduce the need for certain medications, including psychotropics. And these puppies, our aim is for them to be the first puppies that are FDA medical devices. So this is audio and video, obviously, but an audio-only podcast as well. So I'm looking at two little puppies here that look like little golden puppies, and they're moving their heads very naturally, and look, there's no Uncanny Valley here. Exactly. So one of the things I studied, I went back to school and got a master's from Stanford before launching the company. I ended up studying the Uncanny Valley as part of my focus there. And what I realized is that where human beings are very sensitive to the Uncanny Valley as we're dealing with other humans or humanoids, primates have been studied where they're looking at other primates or primate-like things. And the Uncanny Valley is actually a very real reaction in fMRI. But it's only between your own species? Like it's not as weird if it's a dog? That was my hypothesis: it would be prominent intra-species, but it would not be prominent inter-species. And so we tested this. We teamed up with Jim Henson's Creature Shop, the people behind the Muppets and Sesame Street. And with their help doing our artistic design, we've created multiple generations of high-fidelity prototypes to figure out what seniors with dementia wanted. This one over here is actually our fifth-generation customer study prototype.
[13:44]But she's rather fragile. I have to cover her ears when I'm talking about her. She's rather fragile and can't pass any of the many safety certifications. So we needed to fully re-engineer her. That's this one. This is Alpha 2, Jenny. We're just finishing up our Alpha stage. We'll be in Beta shortly. And then as soon as the Betas are working the way we want them to, we'll start fulfilling our backlog of orders. So now for the people who are just watching or just listening, these don't move around. Jenny's job is to sit in your lap and be petted and make you feel comfort. Exactly right. So one of the things we learned early on is that anything on the ground presents a fall risk for a senior with dementia. So products in this category are never placed on the ground, which means having the ability to walk or stand up or do things like that actually created a safety hazard. So these are designed as lap dogs, designed to lay comfortably on the senior's lap, but they're fully interactive. So covered with sensors. They can feel how and where they're being touched. Oh, really? Oh, I didn't notice that. They respond to a long list of voice commands, but only to their name. And we can, through our software app, we can rename the robots.
[14:57]They can feel themselves being moved, so if they're picked up, they know not to squirm and risk being dropped, and they have a variety of other sensors to help understand their environment and exhibit behavior that's appropriate to their context. Can you demonstrate something you're doing that's changing the behavior of Jenny? Let's see, you just looked under Jenny's tail. Is that where the controls are? I'm looking at an LED light. So Alpha 2 Jenny is a test robot, and she doesn't always like to work. So like a real dog, she's not going to obey on command, right? Right. So there are five mechanical subsystems: the mouth opens and closes, the eyes open and close, the ears, head and neck, and tail. And basically the movements are designed to accurately emulate those of an eight-to-ten-week-old Labrador Retriever puppy. Oh, I said golden, but the ears are perfect. Thank you. The way they move, that is really amazing. Yeah, working with Jim Henson's Creature Shop, we got a chance to really study
[16:01]anatomical structures and then try to reproduce those through our mechanical and electrical engineering. This is really, really interesting. So if people wanted to learn more about TomBot, where would they go? TomBot.com. T-O-M-B-O-T.com. And I think I heard you say to somebody else, you're in beta. We're just finishing alpha. These young ladies are at the end of their test period, and we're going to enter into betas. And as soon as the betas are working the way we and our customers want them to, we'll be starting to fulfill our backlog of orders. We have about 8,000 pre-order and waitlist customers, including over 200 B2Bs, which include the nation's largest hospital health systems, assisted living, memory care, behavioral health institutions, and so forth. This is amazing. Well, I hope these puppies get to, you don't throw them in a closet when their work is done. They get to sit around the office. You should see the museum in my office. It's getting a little crowded. That's a little creepy, maybe. These are prominently featured when our engineers allow me to have them, which is for this week, getting them together. But anyway, yeah, we're very excited. We've been very well received in the marketplace, and we're excited to try to get these out into customers' hands. Very cool. I wish you the best of luck. Thank you so much. Thanks for visiting.
[17:21]
CES 2025: Soliddd - Vision Correction for Macular Degeneration
[17:21]My mother had macular degeneration, and this is a degenerative disease of the retina where you start to lose your central vision over time. It runs in my family and other family members as well, so I was very interested in Soliddd Vision, who are working on some glasses to maybe help people who have macular degeneration be able to see better. And I'm here with, I'm sorry, am I going to get this right? Madi Atia? And he's going to tell us a little bit about this product. Can I hold it? Allison, thank you for your interest. What Soliddd is doing is developing smart glasses that would help people with partial damage to the retina, like macular degeneration, with true vision correction to restore sight. So it's through smart glasses, and I think you've got some little displays here to talk about what it is. Yeah. Now, from a question Steve asked earlier when we did our preamble, I understand this is glasses that have basically a display on the inside of the glasses, so if you're wearing these, you wouldn't be able to see through it, for now. But what's in that display? So before we talk about the product, we talked about the technology. Let's talk a little bit about the human brain. Okay. So what we discovered, and I'm simplifying it, is this.
[18:50]Much like the way our two eyes create a stereoscopic image, we look with two eyes and in our brain we see one image, although we see it in 3D. You can accomplish that in one eye, on the retina. So what we are doing is projecting 64 tiles of the same image, in focus, that's the trick, in focus, to the retina. And then the brain, even with a healthy retina, the brain would take the 64 images and the visual cortex would reconstruct one full image. You talk about in focus. Oh, for sure, I've got so many questions. You talk about in focus. In focus for me and in focus for you is different. How do you do that in these glasses? Oh yeah, you're just back to the details again. We can do the details. So think about it: if I manage to bypass your pupil, it's not dependent on your eyesight. Oh, right. So you're creating images that are a size that is smaller than the pupil.
[20:03]And what we do, and you can see it here, now I'll try to explain the product for you. You can see it here on picture number one. We have smart glasses with two cameras facing out, and we capture this image and we manipulate it in such a way that your retina would see it in focus. So if you think about it, you capture it here, and I am going to project it on this side of the retina. If I capture it on the right eye, right side, it would be on the left side. Now I have to project on the left side of my retina, so I have to manipulate the image to be like that. And add to that, we had invented this flat optics, flat telephoto optics, that allow us to do the projection. So now what happens is we project 64 tiles, in focus, to the retina, and we have redundancy. So if part of your retina is damaged, the brain can still reconstruct it into one full image. With 64, it's bound to have enough information as long as at least one part of your eye is functional. More than one? More than one, because we also have, and I'm going very technical now, if you know a little bit about the retina.
[21:24]The resolution of the periphery of the retina is much lower than the central retina. So to get good resolution, we need to project more than one image, you know? Oh, I see. I see. That makes sense. So at this point, we can see this is at a prototype stage. Yes. Obviously, it looks very promising. Have you been testing this with real human beings? Yeah, that's what I was about to tell you. So what you see here, I'll show you in a minute. Okay, it's our first prototype. We called it the desktop prototype, and we tested it with more than 30 patients with macular degeneration, and 95% of them saw improvement in their reading speed. Some of them couldn't read and started reading. Some of them read, but they read faster using this technology. Wow, that's astonishing. Yeah, I understand that it's really hard to get used to not being able to see in the middle, because you keep trying to look at it, and when you look at it, then you can't see it anymore. So I could see how the speed would improve; you'd have to keep training yourself to look off axis. Normally, surprisingly enough, even for a healthy retina.
[22:36]The brain adjusts so fast. I'm sorry, I was talking about without this tool. If I'm trying to read and I can't see in my central vision, I've got to keep looking away in order to see things. And that's really hard for the brain to go, no, no, no, I'm trained to look straight at it. But with this, you don't have to do that. So people ask me to explain it in my lay language. I would tell people, imagine you have a camera that a few of the sensors in the middle are not working. That's actually what's happening in the macular degeneration, you know. Right, so in the desktop model what we're looking at here is taped down to cardboard, but he's got one of the displays sitting right here. There is a display with the special optics over it, and we're driving it through the computer. Can we see what it sees or anything? You can see, I'll show you what you have to do, okay? So first you have to take off your glasses, and you bring your eyes here. So he's putting his eye right over it.
[23:33]Right over the display. And then, as you look at the picture, you tell me what you see. You know this lady? I see. And if you go up, you start seeing that we actually project 64 times. Okay, I'm going to give this a try. You might need to change chairs, or can we slide it over closer to you? Okay, so instead of using glasses, it's basically just taped down, and I'm going to put my eye right down on this. All right. Get closer. Closer. Okay, so I'm looking at a woman. You know her? That's probably Taylor Swift is my guess. Correct. All right. Now lift your head, slowly, slowly, slowly. Okay, I'm lifting. Do you start seeing the tiles? Yes, now I can see the individual images. Oh, wow.
[24:22]Oh, that's so weird. Steve, you've got to look at this. That is really, really interesting. How to perceive it: at the beginning, when I saw it the first time, I didn't get what's happening, you know. You do have to back away to start understanding that you're looking at 64 individual images. I've got to look at it again. Okay, I'm up real close as I come up.
[24:48]That's really... I know that's terrible for audio, and maybe interesting for video, but it is hard to show, because actually it happens in your brain, in your visual cortex. Oh, that's interesting. Well, we might have to try to capture it to explain it to
[25:03]friends. And I realize I don't know how to, actually. I think that might make some interesting video, to help understand that it is my brain doing it, not my eyes, not the sensors that are my eyes. Yeah, correct. So you've got a prototype now. Do you have any idea when you're supposed to go into production?
[25:24]Or, you know, technology is challenging. Our plan right now shows that it will happen in the spring or summer of this year. You say this year you'll have a product? Yeah, yeah, this model, we hope, would be available in the spring or summer of this year. So how many millions of pre-orders do you have? We don't have millions. You will. So our founder, Neil Weinstock, is very conservative, so he first wanted to see that it's working. He's now raising the money, and first we'll test the prototype, and then we'll go to production. This is spectacular. Macular degeneration is the leading cause of blindness in the world, so this is a problem that should be solved. That's right. And I'm not good at the numbers, but I think here in the U.S. we're talking about 20 million people suffering from macular degeneration. I live in fear of it. I actually have dry macular degeneration right now. Not all dry turns into wet, and wet is the bad one, but all wet was at one time dry. So I live in fear of this. So maybe this will be all working by the time I get it. Am I allowed to give you advice? Sure. Just keep doing follow-up with your ophthalmologist.
[26:44]I do. I go every six months to a retina specialist, and I take all the vitamins that they know can help. But, I mean, it might never happen, but one day they may find a cure. Right now they know how to slow the progression of macular degeneration; they don't know how to cure it. That's right. So if people want to find out about Soliddd Vision, it's S-O-L-I-D-D-D Vision. And is that SolidVision.com? So it's Soliddd, S-O-L-I-D-D-D, Soliddd.com.
[27:18]Say that one more time. It's Soliddd, S-O-L-I-D-D-D, dot com. There we go. Got it. Thank you. There is one reason for that. This is a startup of 15 years, and in the early days, they were trying to develop 3D technology for 3D movies. Oh, wow. Okay. I like it. I like it. Now I'll remember the name for sure. Well, thank you very much, Marty. This is fantastic. Pleasure. Thank you. Well, I really thought that was interesting, and I really, really, really hope he's right that they can actually get into production in 2025. I have to say that I think it's a big leap to picture a product that you're actually shipping based on the fact that we were looking at a sensor that was taped down to a piece of cardboard. Now, I know they had some goggles there, but they didn't let us actually look through the glasses. Maybe it was just because they didn't think it would be effective on us, since we don't have any kind of degenerative eye disease, but it still seemed really optimistic. I hope he's right, and even if it takes a year from now, that is still going to be an extraordinary achievement if this works out.
[28:33]
CES 2025: Monar Canvas Speaker Blends Art, Music, and Technology
[28:34]Well, we walked by what looks to be a picture frame that's got a lot of different graphics going on on it, but I think it's also a speaker. So I'm here with Ben Yu from Monar to tell us all about it. Thank you. Hi, everyone. I'm Ben from Monar, and the product is called the Monar canvas speaker. First of all, it's a canvas: we have over 50,000 masterpieces that you can display on the screen. So he's got a phone in his hand, and he's bringing up the masterpieces. There's Girl with a Pearl Earring. Yep, and like Van Gogh, Monet, you have all of them. And secondly, it's a speaker. We have a feature called Live Lyrics that displays different designs of the lyrics in real time. And also we have integrated AI with this, so the AI generates images based on the lyrics of the song that you're playing, in real time as well. So you've got the song, to lyrics, to masterpieces, but then which picture is it picking? Well, you can choose masterpieces, or you can choose AI to generate based on the lyrics of the song. Oh, wow. Well, it's both ways. Where is the music coming from? Do you have to have, in your app or... yeah, sorry.
[29:45]You're using Spotify or whatever? Yeah, yeah, with Spotify, Apple Music, YouTube Music, Deezer, anything. I got you. Okay. So let's take a look at this. We see a very bright display here that has some cool-looking AI-generated graphics. But what's going on behind the scenes? He's going to go that way. Let's go this way. So if you look closely, we have eight speaker drivers built in here: two subwoofers to play the bass, then two mids at each side, and also two tweeters, one tweeter on each side. In total there are eight, so it's a 2.1-channel stereo. Wow. Yeah. And also we have over 3,000 milliliters of speaker volume to make sure the acoustics are really good. That sounds great. So when do you expect this to be available? Oh, we will be on Kickstarter very soon, probably March this year, just in two months' time. All right, this is a really fun product. Thanks for talking to us. Thank you so much.
[30:48]
Support the Show
[30:48]Well, CES is a lot of work, as I've described, but it also costs money. Hotel rates go through the roof during CES, and we don't even stay in high-end accommodations, I assure you. And the food is super expensive while we're there. Luckily, we drive and have an electric vehicle, so at least we're not paying for plane flights. If you get value from the CES content we bring to the table, consider becoming a patron to help defray the costs. You can do that by going to podfeet.com/patreon and choosing an amount that shows your level of appreciation.
[31:20]Music.
[31:29]
CCATP #806 — Andrea Jones-Rooy on Being a Data Scientist
[31:29]I have become a major fangirl of Dr. Andrea Jones-Rooy. They're a data scientist and a stand-up comedian and a circus performer. I mean, how could they not be fun, right? Andrea is a professor at the New York University Center for Data Science, and in that capacity, they created a 20-part video series called Data Science for Everyone. Now, they're not kidding. This is the best part: the video series is free. I thought, hey, I'll give it a try. I ended up watching all 20 episodes of this video series, and I loved every single minute of it. Andrea also includes a free online written textbook, but I'm going to confess I have not read that book yet. And I didn't do any of the homework they assigned us. Anyway, with that, welcome to the show from your favorite teacher's pet, Andrea.
[32:13]Hi, it's so nice to finally meet my favorite teacher's pet. I can't tell you how scarce pets are these days. So this is very exciting to be here. And honestly, I think you win the prize. I'm not sure anyone has watched all of them besides you. So correct me if I'm wrong, listeners. In one of them you did say, I don't think anybody's still listening. I mean, who would still be watching this? I mean, the idea originally with these... so it's not the full course. The idea was, I'm going to do highlights of the key concepts so that students can refer to them, and I think they're important, so maybe other people with an interest in data, whatever. But then as I went, I was like, it's really hard to just pull out highlights without explaining everything. So I must have said that on, like, minute 20 of what was supposed to be a four-minute video or something like that. None of them are terribly long, though. I mean, they're like 20 minutes, 25 minutes, something like that. They're bite-size. I crammed through them in like three or four days. I really enjoyed it. Yeah. Wow. Well, thank you for watching. I actually have a few more that I never published, but I think they're more like a wrap-up sort of thing. So after a while, yeah, they just sort of end. You're just like, then goodbye? No, there is something of a goodbye that I need to post to finish the final season, I think, to be able to tell that we're done. Well, let's start in the middle of this. There are so many things I want to ask you about, but let's start in the middle with: so what is a data scientist?
[33:36]So a data scientist, if you ask anyone, they'll probably tell you like 200 different answers. Like, I don't even think a data scientist has a concrete definition of a data scientist, but my version of a data scientist is someone who uses data and science to make discoveries about the world. And you might be asking yourself, isn't that just regular science, Andrea? And the answer is kind of, yes. And I prefer a more inclusive definition than a more specific one. But usually in data science, the way that we're using data...
[34:05]Through computation or computationally intensive methods like machine learning or other algorithms or AI or whatever kind of technology is of interest to you. And so you're using computation and statistics in clever ways to learn more from data than perhaps you might if you were just doing, you know, like a mean and a couple of scatter plots. That said, I think you can learn a ton about the world from means and scatter plots. And I think in the excitement about machine learning and all this other stuff, we've forgotten that you can learn a ton from very simple methods too. So that's part of why I'm so broad about data science. Like if you're using data and you're being scientific, I call you a data scientist. I loved at the very beginning, I was yelling at my iPad saying.
[34:50]Oh, you know what would be great, Andrea? If you knew about that site that correlates things like the amount of cheese in Wisconsin to earthquakes in Zimbabwe. And in the next lesson, you put it out there to explain correlation. Yes! Tyler... I don't know how to say his name, Tyler, or her name, or their name... TylerVigen.com, V-I-G-E-N, is the site. It's amazing. The ones I've generated... priceless. Some people have friends and hobbies; I just sit and click generate on the Spurious Correlations page. If you google spurious correlations, I think you'll pretty much get to it. But yeah, it's like divorce rates in Norway correlate with droughts in Kentucky, whatever. It's amazing. It's beautiful. Yeah, there's usually a movie released by... Yes.
[35:35]I can't remember his name. Nicolas Cage. Nicolas Cage, yeah. Nicolas Cage movies correlated against other things are great. Yes. He correlates with a lot, it turns out. I think if I ever become a conspiracy theorist, it's going to be full-on Nick Cage-related conspiracies. You heard it here first.
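For anyone who wants to play along at home, there's a real statistical mechanism behind the joke: two series that each happen to trend over time will correlate strongly even when they have nothing to do with each other. Here's a minimal sketch with entirely made-up numbers (the quantities and slopes are invented for illustration), assuming NumPy is available:

```python
# Two unrelated quantities that both happen to grow over time
# will look strongly correlated. All numbers here are invented.
import numpy as np

rng = np.random.default_rng(42)
t = np.arange(20)  # 20 years of pretend data

cheese = 10 + 0.8 * t + rng.normal(0, 0.5, 20)      # pretend cheese consumption
drownings = 100 + 3.0 * t + rng.normal(0, 2.0, 20)  # pretend pool drownings

# Raw series: the shared upward trend dominates, so r is close to 1.
r_raw = np.corrcoef(cheese, drownings)[0, 1]

# Year-over-year changes strip out the shared trend; what's left is
# independent noise, and the correlation collapses.
r_diff = np.corrcoef(np.diff(cheese), np.diff(drownings))[0, 1]

print(f"raw correlation:       {r_raw:.2f}")   # close to 1
print(f"detrended correlation: {r_diff:.2f}")  # much smaller
```

Differencing the series is one standard sanity check before taking a headline correlation seriously.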
[35:52]So you mentioned the idea of using data and science. And like you said, I mean, to me, that's what science is: science uses data. And people who make decisions that aren't based on data, that bothers me. I'm all about data. But you also get into talking about using data where you can't do an experiment to create the data. So I'm used to: okay, I developed this theory. Let's say I'm an astronomer and I'm going to be doing, you know, not the experimental side, I'm the other side that I can never remember the name of. Theoretical side. Theoretical, thank you. So I've developed this great theory, and then I need to be able to create a test to see whether that theory is true, and so I need to collect data and figure out whether it maps onto that.
[36:44]Well, astronomy is probably a bad example, because you can't create that either. You can in the Large Hadron Collider and other experiments that you do locally, locally as in on Earth, in massive locations, to test theories about physics. Because you're exactly right: even in astronomy, you're limited to what's called observational data, as opposed to lots of sciences where, in medicine for example, you get to run experiments, and that's the gold standard for inference, right? You go measure things on a mouse, poor mousy, and you do these experiments and figure out what works and what doesn't, and develop theories and test again, and that sort of thing. But in the fields where you do data science, you don't get to start there. You have to start with data that exists. Because I think one of the examples you gave was, if you wanted to see whether cigarette smoking causes cancer, you can't take half the group and say, okay, you have to smoke, because apparently there are issues with that.
[37:40]I mean, it sounds fun to me, but I just couldn't get it past the ethics board at my university. So maybe in the days of Don Draper they could do, you know, cigarette studies. But that's exactly the distinction. So my background is in political science, and I do a lot of research about political phenomena. How are people going to vote? That might be on a few people's minds these days. Or how did people vote? Why did they vote the way that they did? Or even things like wars, and what kind of regimes lead to economic prosperity, or democracy versus autocracy, is one better for the health of the citizens? You can't randomly assign democracy to some countries, or randomly assign a civil war, and see what happens. Right, right. I mean, in some ways you can kind of mess with the system a little bit... and now I'm becoming conspiratorial again. I think you have to believe in a deep state to be like, actually, you could run an experiment on us. But generally speaking, for the things I'm interested in, we don't get to run the most recent U.S. presidential election again under a different treatment. What if Kamala Harris had gone on the Joe Rogan podcast? What if she had messaged differently about the economy? A lot of people have arguments that say, yeah, then she would have won if she did this or she did that. The only way to really tell scientifically is to literally run the universe
[38:56]A second time with this one thing changed. Just like in a drug trial or these poor mice, right? It's like, I'm going to give, everything is the same between these two mice, except one gets the drug and one doesn't. And we see if they get better, right? And you only change one thing at a time. That's controlled experiments, but you can't do that. Exactly. You cannot do that, both for ethical reasons in the case of civil war and assigning people to smoke and for practical logistical reasons. We can't run 10,000 elections in the United States all at the same time. It's just until time travel is invented, we can't do it. And so what you're working with then is observational data. And what observational data means is you have to be a lot more cautious when you try to learn things about that data, particularly as it pertains to causality. So the reason we do randomized controlled trials is that those are the closest we can get to causality. So if I think A causes B, I need to run two sets of the world where the only thing that changes is the presence of A in one and not in the other. And if A causes B, when A is there, we'll see B. And when A is not there, we won't see B. And even then, you probably want to do it a bunch more times with other variations just to make sure that it's not some C, D, or E, or whatever. But you really can't do that for most data that we're interested in.
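The logic Andrea describes can be sketched in a toy simulation: invent a hidden confounder that drives both who gets "treated" and the outcome, then compare the naive observational estimate with a randomized one. Everything here, the effect sizes, probabilities, and the stress-level interpretation, is made up purely for illustration:

```python
# Toy demo: a hidden confounder manufactures a "treatment effect"
# in observational data; randomization makes it vanish.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

confounder = rng.binomial(1, 0.5, n)              # e.g. a hidden stress level
outcome = 2.0 * confounder + rng.normal(0, 1, n)  # driven ONLY by the confounder

# Observational world: the confounder also drives who gets "treated",
# so treated and untreated groups differ before treatment even starts.
treated_obs = rng.binomial(1, 0.2 + 0.6 * confounder)
naive_effect = outcome[treated_obs == 1].mean() - outcome[treated_obs == 0].mean()

# RCT world: treatment assigned by coin flip, independent of everything.
treated_rct = rng.binomial(1, 0.5, n)
rct_effect = outcome[treated_rct == 1].mean() - outcome[treated_rct == 0].mean()

print(f"naive observational effect: {naive_effect:.2f}")  # far from zero
print(f"randomized effect:          {rct_effect:.2f}")    # near zero
```

The treatment does literally nothing in this simulation, yet the observational comparison reports a large effect; that gap is exactly why randomization is the gold standard.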
[40:05]And so that means you're looking at data that already exists. And that means you have to think very carefully about what you can learn from this data and be very transparent about what you cannot learn. And so that means websites like this Spurious Correlations one are so instructive because you can find correlations between all kinds of different things that may have no causal relationship whatsoever.
[40:29]So for example, you know, Nick Cage movies tend to correlate with people drowning in swimming pools. And probably it's not the case that Nick Cage is causing people to drown in swimming pools. We could kind of make a story. I usually like to make my students tell a story, it's a bit dark: why might...
[40:46]What if it were a causal relationship? It's like, well, we really don't like the movies, so we kill ourselves. It's horrible, right? Or you're so entranced by the movie that you just fall into a pool. It's difficult. But the correlation is really, really strong. And so that's just one of many ways that we need to think about data: not just run the analyses and then infer whatever we want. Because when we're working, as you can see, I could talk about this forever, when we're working with observational data, there's a lot more careful reasoning and thought that needs to go into the conclusions we draw, because we can't control what generated the data in the first place. That sounds really hard. There are some cool things. Drawing those scatter plots is fun, though. It's very fun. The other thing I'll say is, you know, the line "correlation does not imply causation" is important. It's one of the first things we all learn in statistics. But it's also important to remember that it doesn't not imply causation. So if you're saying, I think there's a correlation between... let's just keep going with Nick Cage and pools. If I think there's a correlation between Nick Cage movies coming out and people drowning in pools, and I think that that correlation is meaningful...
[41:58]It could be. Just because it's a correlation doesn't mean it's not meaningful. There could be a causal relationship there. It just means we can't say for sure. So a lot of times the best we have is correlation. Here's a more concrete example with cigarettes. It took forever for the United States government to pass laws to put labels on cigarettes about how dangerous they were, to limit the marketing of cigarettes to children, to not advertise them near schools, to ban smoking indoors, all those things, because the tobacco lobby said correlation does not imply causation. And because we can't do a randomized controlled trial and randomly assign 20 people to smoke for 50 years and see what happens to them, because we think it might cause something bad, we only have observational data. We only have correlations, and we have to do our best with that and say, look, we've looked at it every which way. Even though the best we've got is a correlation, that doesn't mean there's not causation, and we've checked for all these other things. Because you could say, well, if correlation does not imply causation, then why else might people who smoke a lot be more likely to get lung cancer? And maybe you make an argument that it has to do with where they're living and it's some other environmental factor, or maybe it's a level of stress that's causing both. You have to turn over all those rocks to make your case. But sometimes correlation is the best that you have. Well, and of course, the government should still continue to subsidize tobacco growing.
[43:24]Oh, yeah, of course. That's the key. Yeah, exactly right. I mean, the real challenge, and now you've got me on my high horse, the real challenge with data and with science is that data and data science are both very much about uncertainty, about being like, yeah, we could be wrong. We've done 2,000 studies tracking smokers and nonsmokers and their health outcomes. But as true scientists, we need to be completely transparent that the next study we do could suddenly reveal that it's actually super healthy for you. That's very, very unlikely. But truly good science is science that says, yeah, we just don't know for sure, because maybe we just haven't studied the world the right way. That little sliver, unfortunately, is where tobacco companies and lobbyists operate, and we see the same with climate change, we see the same with vaccines. You can exploit the inherent uncertainty in science to cast doubt on all of it, even though it is pretty, pretty convincing.
[44:26]And the uncertainty is part of what makes us scientists: admitting we can't know everything. Right. I love that point, because obviously we saw that with vaccines and masks and everything. The anti-science people would say, oh look, you said we had to do this, and now you're saying we have to do that, so you don't really know, so science is all BS. And I have a great example, and it kind of folds in... you and I were talking beforehand about playing around with AI like ChatGPT, asking questions, and when it makes up answers. My favorite example about that, and it gets back to this point, is the first thing I ever asked ChatGPT: tell me about the moons of Mars. They're Phobos and Deimos. And it spit out this beautiful article that read like it was written for Sky & Telescope magazine. I mean, it was articulate. It was fascinating. It was really good. And one of the lines in there said that, unlike all of the other planets in our solar system, the moons rotate in the opposite direction.
[45:25]And I thought, now I'm no astronomer, but I think I would have heard about that, and there'd be some real physics questions in there. Like, let me go double-check this. And I looked, and it's of course false. That isn't what happens. However, imagine that you're an astronomer living on Mars, and you look up in the sky and you see the two moons rise in the west and set in the east. You would draw that conclusion, right? That is a scientific observation. You can prove it. You can see it. You write it down. It's in the scrolls. And then you get spaceflight. You go out and look at the planet, and you discover the moons are orbiting in a way that, relative to the planet's rotation, means they really do appear to rise in the west and set in the east, but from the surface it makes it look like they're rotating in the wrong direction. So that's a case where new scientific evidence disproves something that was incontrovertible when you could only stand on the planet. And I love that example; it's exactly what you were talking about. That's such a good example. I did not know that one. And shout out to ChatGPT for being confidently wrong enough that you looked it up to confirm. A friend of mine gives tours at the American Museum of Natural History here in New York City. And she was once double-checking some of the facts... it was a bug exhibit.
[46:46]Before she went on the tour, she was new to that particular wing of the museum. Anyway, she was asking ChatGPT something, and it said, you know, like bugs, humans also have exoskeletons. And she was just like, what? It's really sometimes very wrong. But the point is, something that is taken as cold, hard, capital-T truth... "Earth is the center of the universe" is another good one. Those are all really good examples of the kind of humility that we need. Stop me if you know this one, but my favorite, like what you described, is the tale of the inductive turkey. Do you know the tale of the inductive turkey? This is the children's book to traumatize the youth. So inductive reasoning is what we're describing: you observe something, and from all those observations, you draw a broader conclusion. Ah, the sun must rise on this side and set on that side, right? As opposed to deductive, where you start with a theory and then look for evidence, like we talked about earlier. So the inductive turkey is a turkey that lives on a farm, and every morning the farmer comes out, opens the gate, and hands the turkey food. The next day, the farmer comes out, opens the gate, hands the turkey food. Next day, same thing. So the turkey uses inductive reasoning to conclude: wow, the farmer is my friend. The farmer is going to bring me food every day. And that happens for two years until, lo and behold, the farmer comes out, opens the gate, and kills the turkey. And you're like, oh, it's true until it's not. This is really a children's book? It's not. Oh. I'm working on it. Yeah, no, it's not a children's book.
[48:12]It's a story that I traumatize my students with. It's not a book at all. Oh, okay. It's from a random social science book that I read once. But you say that, right? And I can say that to smart, thoughtful, science-minded people like you, and it's like, yes, of course, we need to have that kind of humility and know that the next study could show everything's wrong. But again, the problem is that you put that language in front of, let's call them bad actors, people who are not truly trying to understand the world but are trying to push an agenda, and they can tell whatever story they want. Which is why we see headlines like every other day: coffee cures cancer, coffee causes heart disease, coffee's going to kill you, coffee's going to make your life better. Because every single study is a teeny tiny bit different, and they're focusing on... is it the caffeine? Is it whatever coffee is made of? Is it something else? You're going to get all this noise, and over time you kind of get a clearer picture. But as you said, the minute you go out to outer space, you might say, oh look, we've been looking at the moons all wrong. So we just have to always be open to that possibility. Yeah. Dr. Maryanne Garry has been on the show a bunch of times, and people are big fans of her, and one of the things she said to me on one of the episodes was: when you hear about a study, ask yourself one question. Compared to what? So when they tell you the Mediterranean diet is going to give you low cholesterol, and you're going to be thin and beautiful and have long, dark hair.
[49:33]But you've got to look at compared to what. So what are those comparisons every time? And is that part of what you have to do when you're trying to draw causations from big data or small data?
[49:48]Right. So that is exactly it; that's a great question to ask: compared to what? In this case, we're looking for something that would be akin to the control group, right? So is it someone who isn't eating a Mediterranean diet? Well, then we would want to know, what kind of diet are they eating? And all of that. My advice, anytime I see a study, and I really encourage this for everyone, you don't need science training to do it: anytime you're reading the New York Times, or you see something on Instagram that's like, cool scientific study, go to the study itself. Yes, there's a lot of jargon. Science is, broadly speaking, getting better. As you move to the social sciences, there's less jargon, because we all know what voters are in a way that we might not all know what mitochondria are, or whatever, right? So there's a little bit less jargon. But it's worth it, even if you don't fully understand all the jargon. And a lot's behind a paywall, no good. But even looking at the abstract helps you get a sense of it. So I can see two headlines: one, coffee kills you; one, coffee makes you live forever. One, the Mediterranean diet makes you live forever; one, the Mediterranean diet is terrible, whatever.
[50:49]Go to the study itself, and the study itself is going to be something very, very, very, very, very small and specific. A good study, you know, kind of unfortunately needs to be very small and specific. One of my dissertation advisors used to always describe it as like an hourglass, like your big question, you know, is coffee bad for you? Is the Mediterranean diet going to save your life? is interesting. And then the headlines are even broader than that because they want people to click and it's clickbait. But the actual study itself is going to be like, well, we had 12 men and 12 women take a Mediterranean diet cooking class. And then a month later, we tested their blood pressure. You're like, what?
[51:26]And maybe that's informative. And some of the studies are, many studies are better than that. That's why I'm not in health sciences. But that's often what we're working with. You're like, oh, I see you.
[51:35]A study necessarily has to do this randomized controlled comparison, or approximate it as closely as it can. You really do have to be so very specific and say, okay, what we mean by this is they just added olive oil instead of Crisco to their cookies. I don't know. You know, and it was funded by the Olive Growers Association of Italy. Well, that's the other thing you want to look for: who's funding these things.
[51:58]So I have a podcast as well, not to bring that up, but you and I had emailed about an interview I did with someone named Christie Aschwanden, who is a science journalist and an athlete. And she talks about similar sorts of things. There's a ton of research in sports science about how to make sure that you're sufficiently hydrated and worrying about being dehydrated, and it turns out there's a whole field of hydration science. There's all this research about, like, you want this amount of salt and this amount of potassium and these electrolytes. And it turns out the field of hydration science was invented by a company that provides sports drinks. So again, you have to have that skeptic's hat on when you look at all of these things. So when you look at any data... one of the other things I encourage people to do, as you've seen in my videos and in my life, is don't be shy about finding data in the world and downloading it and mucking around with it. This is changing in some fields, but I think all research should be open source, and data should be available to the extent that it's ethically possible. It's not always possible if it's sensitive, like health information. But if you can download the data, just start looking at it, even with a simple scatter plot. The first thing you do, you're in Excel or whatever, you have your scatter plot, and you say, wow, there's a positive correlation between
[53:11]long, beautiful hair and the Mediterranean diet, or whatever data it is that you've downloaded. The first thing you want to ask yourself is: what are the plausible stories I could tell that would give rise to this picture? And there's almost always going to be more than one. If anything, there are probably going to be dozens. And then your next question is, okay, how do I distinguish between these different possible stories? Are these correlations completely spurious? Is there some causation? If so, what's causing it? And then you go and find data on that thing, and you kind of keep drilling down as best you can. I don't know if that makes any sense. Yes, it does. And I want to circle back to Christie, but I want to point to this exactly. I discovered that the Department of Energy has massive amounts of data you can download on electric vehicles, on charging stations, on whether it's light trucks or what, just all kinds of great data, and you pick and choose little columns. One of the things I wanted to look at is how many more charging stations there are than there were a year ago. So I downloaded October 2023 and October 2024. And I did... you love pivot tables, right? I love it. Okay, good. Yeah. So, pivot tables... I knew we would bond on that one. Yes. So I did pivot tables on it. Bringing people together since they were invented.
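For listeners who prefer code to spreadsheets, the same comparison can be sketched with a pandas pivot table. The seven-row dataset below is invented (the real data comes from the Department of Energy downloads described above, and "network" stands in for whatever grouping column you pick):

```python
# Hypothetical sketch: count charging stations by network in two
# snapshots, then compute overall growth. The data here is made up.
import pandas as pd

# One row per station observed in each snapshot.
stations = pd.DataFrame({
    "snapshot": ["2023-10"] * 3 + ["2024-10"] * 4,
    "network":  ["A", "A", "B", "A", "A", "A", "B"],
})

# Pivot: networks as rows, snapshots as columns, counts as values.
pivot = stations.pivot_table(index="network", columns="snapshot",
                             aggfunc="size", fill_value=0)

growth = (pivot["2024-10"].sum() - pivot["2023-10"].sum()) / pivot["2023-10"].sum()
print(pivot)
print(f"total growth: {growth:.0%}")  # 3 -> 4 stations, prints 33%
```

This is the same shape of analysis as the Excel pivot table, just reproducible in a script you can rerun when the next monthly snapshot comes out.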
[54:27]And I was able to see that there was this massive increase. It was like 34% or something, and I did it by company, how much they each increased and everything. And then I was on Bodie Grimm's podcast, the Kilowatt podcast, all about electric vehicles, and I was talking to him about it, and we were going through it, and he had gone through it as well. And I said, well, this was really cool because I could see that, I think it was the Inflation Reduction Act, that put in money for new charging stations, and this is great to see the evidence that this actually happened. He says, Allison, there hasn't been enough time for anybody to build a station with that money yet. Yeah. Oh, man. Yeah. But that's exactly the kind of reasoning you want to go through. You did it exactly right. And one of the other things you did that's great that too few of us do is, well, A, downloading the data and mucking around with it. You were already a teacher's pet. Now you're a teacher's favorite pet, right? That's amazing. But then it's exactly that. It's looking through it and then sharing your findings with others. Peer review. And particularly with people who know the area that you're talking about better, right? And then they can say, nah, it's probably not that. And now that you know, you can keep digging and say, well, what is it that caused this thing to go up? Or what are the likely candidate stories? And which one seems the most plausible? And you keep going from there. Often the best you can do is most plausible, and that's kind of the highest bar. You also talk about, and this is a really important thing in science, the idea that a valid scientific theory is one that can be proven wrong. Correct. Can you talk about that a little bit? Why is that important?
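(Shownotes aside: the pivot-table step Allison describes, counting stations per company in the October 2023 versus October 2024 snapshots, could be sketched like this in Python. The companies and counts below are made up for illustration; the real overall increase she mentions was around 34%.)

```python
from collections import Counter

# Made-up records standing in for the DOE charging-station download:
# one row per station, with the operator and which snapshot it appeared in.
stations = [
    ("ChargePoint", "Oct 2023"), ("ChargePoint", "Oct 2023"),
    ("ChargePoint", "Oct 2023"), ("ChargePoint", "Oct 2024"),
    ("ChargePoint", "Oct 2024"), ("ChargePoint", "Oct 2024"),
    ("ChargePoint", "Oct 2024"), ("Tesla", "Oct 2023"),
    ("Tesla", "Oct 2024"), ("Tesla", "Oct 2024"),
    ("EVgo", "Oct 2023"), ("EVgo", "Oct 2024"),
]

# The pivot-table step: count stations per (company, snapshot) cell.
counts = Counter(stations)
for company in sorted({c for c, _ in stations}):
    old = counts[(company, "Oct 2023")]
    new = counts[(company, "Oct 2024")]
    print(f"{company}: {old} -> {new} ({100 * (new - old) / old:+.0f}%)")
```

Seeing the increase is only step one; as Bodie pointed out, the harder question is whether your candidate cause could even have acted yet.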
[55:57]Yeah, so you can never prove a theory right. And that's something that I say to my students every day and still don't always get through. And it feels counterintuitive, but basically, you know, you can prove...
[56:12]Like in math, there are proofs, right? And you can say, because of this and this, then we're done. But otherwise, just like the inductive turkey, just like the observations on Mars, you can't prove a theory correct, because you never know if the next observation you see is going to be one that refutes the entire thing. So the best you can do is fail to disprove your theory. And that is so clunky. It's so kind of pessimistic, too. It's like, oh, I failed to disprove my theory. You're like, just own it. You wanted to win, but you didn't. You just failed to disprove. So what you're doing, ideally, as a scientist, is you're saying, I have a theory. I think that the Inflation Reduction Act caused charging station construction to increase. I need to look for evidence: if I'm right, how would that show up in the data? Well, in this case, it would have to be that the construction happens at a time frame long enough out from the act.
[57:07]You didn't find evidence of that, or you don't have evidence yet. So, you know, in this case, you're disproving your theory. If you did find it, you say, okay, I failed to disprove my theory. Maybe this is two years from now and charging station growth still continues. And you say, well, it is plausible in this case that the Inflation Reduction Act did do something in the second year to cause an increase. And maybe the increase was even higher than the first year. Okay, so you haven't proved your theory. You've simply failed to disprove it. Yeah, yeah. Right. Well, I first heard of this when... my husband, Steve, does a lot of research into physics. He loves physics and watches a lot of stuff. And he was talking to me about string theory. And he said one of the problems with string theory is that it is not possible to prove it wrong. And if there isn't a way to prove it wrong, it just doesn't count. You can't really depend on it if there's no way to prove it wrong. It is part of, I would say, the strength and kind of the...
[58:02]the, like, less fun part of science: you're kind of limited in your theories to things that you could test. And, you know, it keeps us from being like, I think it's six octopus gods that are running things, because I can't prove that it is or that it isn't, and so it's sort of not helpful. Right. You know, religion sort of lives in its own separate place, and I'll set that aside. But as far as science is concerned, yeah, you do have to limit yourself to: I think it's the case that this is happening because of this. There must be some way to prove you wrong, otherwise you're not doing science. And that's sort of what happens when you get into, like, flat earthers and things like that. And conspiracy theories get very difficult, because often the evidence for a conspiracy theory is the lack of evidence, and so you can't prove it wrong, right? You're just like, there's no evidence that it wasn't, and you're like, oh shit. Yeah. But the one leeway is, you don't necessarily have to... you want a theory that you can test outright, but often in political science, for example, we are working with theories that you can't possibly fully test the implications of. So there's a theory in political science and international relations called audience costs, which goes that sometimes leaders do things...
[59:19]because they don't want their constituents to be mad at them, even if it's not the best thing for the country. So if I say, hypothetically, I'm going to invade you, scary dictator, if you don't back down, and the scary dictator doesn't back down: I would prefer as a leader not to start a war, but I've publicly said all this stuff, and because I have an election around the corner, I have now incurred these audience costs that are going to force me to follow through. It's like a commitment device to follow through on what I said. People have applied that to Bush and the Iraq war and things like that. You can't really test all of that in a single study, or even decades of studies, but you can get at little pieces of it. So if that's true, then we should see more wars or more invasions initiated by leaders that have publicly declared their intent to invade.
[1:00:07]Okay. But we still haven't proven our theory, because it could be that leaders are declaring their intent to invade simply because they're going to invade. It has nothing to do with audience costs. And then we say, well, what are the audience costs? Now we need evidence from a polling outfit that shows that people really cared about the leader following through on what they said. Okay. And then there's experimental research, like true experiments, where we say, okay, I'm going to put half of the people in one room and half in the other room. One half is going to read a bunch of articles about a leader that drummed up energy for a war and then backed down, and the other about a leader that drummed up energy for a war and then invaded, and we see which side prefers the leader. So you can get at pieces of it. So you don't have to be able to test the whole grand theory, because no theory can be fully tested. And the other piece is, you know, think of Einstein's theory of relativity.
[1:00:53]If you can articulate what a test would look like, even if you don't have the technological capacity to run it, that can count. We weren't able to test implications of Einstein's theory until, I think, after he died, in some cases. And so I don't know enough about string theory to say how much falls under one. How about the Higgs boson? I mean, Higgs got to live until it was proven, right? Yeah, yeah. But we knew that it was possible to disprove it. There was a test. I want to circle back to your podcast. So your podcast is called Behind the Data, at behindthedatashow.com. And, A, it's fantastic, because I'm teacher's pet. I'm going to admit, I'm still a little bruised on the whole political thing, so I haven't listened to the last two that are about political stuff. But when you were talking to... is it Christie or Chrissy?
[1:01:43]Christie. Christie. Christie. Yeah. So she was talking about the hydration studies. But I loved when she just sort of obliquely mentioned using fitness trackers to track sleep. And I sent you my article, because I do like clickbait, that I entitled Sleep Tracking is Stupid. And I've had a lot of fun responses to that because, of course, I was poking everybody who loves sleep tracking. My theory, which I have not tried to disprove, is that people just like data about themselves. That's all it is. If we had a way to measure the size of our ears every day, I think you could sell an app that does that and get a whole group of people excited about it. Now, that's a study. So that's kind of what we would call a field experiment: you develop that thing and see if you can sell it. And if you can, we won't prove anything, but we'll fail to disprove that people will buy any device that tracks data about themselves. And you can make up some science-sounding things about how, like, you know, it's correlated with longevity. Some garbage, right? It doesn't have to be true. You can just say whatever you want. Sleeping on your left side or something like that. In fact, if you wear the Oura ring and track the size of your left ear, that would be really good. Right, right, right. But the thing that I poke at with this sort of tracking of yourself is that if there's no actionable information taken from it, then it's simply, huh.
[1:03:10]That's all. That's interesting. Like, I already woke up exhausted. Knowing I woke up 17 times didn't actually help me not wake up 17 times. Exactly. And it might even make you feel worse, because now you know that you didn't sleep well. There's something about seeing it. So, for example, I was very, very sick in Macedonia once, a long time ago, and they record temperatures in Celsius. And as a true American patriot, I don't know what Celsius means. I think about data all the time; I cannot get my head around Celsius. So I'm in bed, they're taking my temperature, they're saying, wow, your temperature is 40. Your temperature is 40. And I'm like, okay, whatever. And then finally on day three, I drag myself out of bed and, because this is pre-cell phones, Google: what is 40 degrees in Fahrenheit? And it's like a lot. It's like way over 100.
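(Shownotes aside: the conversion Andrea needed, for anyone else caught out by Celsius.)

```python
def celsius_to_fahrenheit(c):
    """Standard conversion: F = C * 9/5 + 32."""
    return c * 9 / 5 + 32

print(celsius_to_fahrenheit(40))   # 104.0 -- "way over 100"
print(celsius_to_fahrenheit(37))   # roughly 98.6, normal body temperature
```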
[1:04:00]And I was like, oh my God, I'm sick. I'm way sick. And then I felt really sick. Yeah. What's that? Yeah, you're way sicker now that you know. Yeah. It's like when a kid falls, they don't start crying until they look up and see their parents' horrified faces. So, and Christie makes a similar point to you, which is, like, you know, we have built-in devices in our bodies to sense if we're tired, or we're thirsty, or we need a nap, or we slept well, and maybe we should start to go on that. Because, sure, you know, I think your summary of it is exactly right. You're just kind of like, huh, okay. But then you feel like you want more information. You're like, well, why was I waking up, and where was I going? Like, maybe if you could start to do that... And, you know, of course I wouldn't begrudge anyone who really felt like they needed to go into, like, a sleep study for sleep apnea or whatever, 100 percent. But most of us don't need this data. Right. It's very silly. Yeah. There was Jill from the Northwoods, actually. She wrote another article for my website, entitled Sleep Tracking Isn't Stupid, because she happened to discover she was waking up at, like, 2:13 every single day.
[1:05:08]Why 2:13? So she set an alarm for 2 o'clock, and she waited till 2:13, and it turned out, like, the heater was turning on, or there was some device near her bedroom that was turning on, and she changed the timing of it so it would stop waking her up. Okay. Okay. There's one. Yeah, there's one. I mean, it's one of those too, where the real answer is somewhere in the middle, and I would say closer to your, and I agree with you on the stupid side of things, to our stance on it. Yes, there are cases where this information is helpful, but those cases are rare. So think of it kind of like driving a car. There's only so many pieces of information on the dashboard. If there was a bunch of other information, I don't know enough about cars to say, but a bunch of other information that you didn't really need unless something was very, very wrong, you would just look at it and fixate on it and start freaking out for no reason, when really you don't need to start paying attention until you notice this other thing, you know, the temperature has gone up. A friend of mine, Steve Novella, who's a medical doctor, was also on the podcast. He was fantastic. Oh, he's so good. He's so good. His show, The Skeptic's Guide to the Universe, is all about the kind of critical reasoning that we've been talking about this whole time. So if any of your listeners are interested, I highly recommend Skeptic's Guide to the Universe. He talked about the same thing, unprompted by me, in medicine, where, I'm going to get this wrong, but the idea was that doctors for a long time, when they were delivering babies, would track the fetal heartbeat and the fetal heart rate.
[1:06:33]Because the idea was, well, a beating heart is a sign of health. So why not keep track of that during birth? And maybe that can help improve outcomes. And they were doing it and they were doing it. And it turns out, something like, whenever the mother pushes, the heart rate slows way down, right? Because of just how the body is working and what's happening to everyone in this crazy situation.
[1:06:56]And that caused doctors to freak out. And they did a lot more emergency C-sections than they should have for some number of years before everyone figured out, oh, this is why we're seeing this. So it's actually better if we just don't have that data; it didn't improve health. It turned out to be superfluous data. And you could argue that it's good they went through that, and now they know. So we can all track our sleep for a year or two, figure out the one or two interesting things, and then let it go. Until I'm really feeling generally horrible, I have deleted all sleep tracking. I mean, the example that I come across a lot: I do a lot of consulting in, like, the corporate, nonprofit, organizational world, where they're very interested in, we have all this data on our employees, on new hires, on people we want to hire, clients, customers. How can we optimize whatever relative to them? And you're like, well, it gets shady. But basically all of them want dashboards. They're like, we want dashboards about all our people. And we want dashboards about all our customers. And you're like, I promise you, you don't want dashboards. Just sitting and staring at a whole bunch of numbers, you're like, okay. Unless you know what you're looking for, or you've said what your action will be as a result of this data.
[1:08:05]It's not worth tracking that data. It really isn't. The example I like to give on that is, and I got this from a metrics class I took at work, where the guy said, if you want to lose weight, the thing you don't want to do is weigh yourself. Exactly. Because weighing yourself, it's just like sleep tracking: it is the result. What you want to do is measure how many calories you are burning and how many calories you are consuming, and then you track your progress with weighing yourself. And I remember telling the guy, I said, yeah, you know, I think you're onto something, because I tried weighing myself twice a day and I still didn't lose weight, you know? It had no connection to it at all, but it's the exact same thing. And until they have sleep tracking apps where it somehow automatically knows that I had three glasses of bourbon right before I went to sleep, or I played video games right till five minutes before I tried to go to sleep... until it can measure those things, or you have a way to input it that's, you know, easy to do, I don't know that it's really going to help people. Right. I am working on a current study right now, developed yesterday or the day before, with a friend of mine, Damien. The other day he posted a bunch of stuff on Instagram, on his stories, that was really funny, and I messaged him and I was like, your stories are killing it today. It was just super funny, weird stuff. He's a funny guy; he's a comedian. I was like, you're killing it today. And he said, thanks, I'm completely exhausted, so maybe that has something to do with it. And I was like, maybe. And then we kind of went down a Nick Cage tangent about being funnier when we're tired.
[1:09:35]Yeah, exactly. And so he was like, well, I do have the sleep tracking app. So he was like, I'm going to send you a screenshot of my sleep health each day, and you're going to score my stories on a scale of one to ten of how funny they are. So I am running a study currently. We're two days in, on the hilarity of his Instagram stories as a result of no sleep. But that's another example of, like, even if I did really do that study, and maybe I will, because as I talk about it, that actually would be really fun... But then it's, like, my subjective idea, whatever. Sure. So I do the study, a teeny tiny study where maybe I track him for a year, I subjectively apply these scores, he sends me his sleep data, you know, whatever. He's only sleeping six hours a night, so obviously his stories are very funny. Damien Chadwick on Instagram, get involved.
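(Shownotes aside: the two-person study Andrea describes boils down to correlating two tiny columns of numbers. The data below is entirely made up; the sketch computes the correlation and then re-checks it with one day removed, the kind of quick robustness check worth doing before drawing any conclusions from seven data points.)

```python
def pearson(xs, ys):
    # Plain Pearson correlation, no libraries needed.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical week: hours slept vs. a 1-10 funniness score for that
# day's Instagram stories.
hours_slept = [6.0, 7.5, 5.0, 8.0, 6.5, 4.5, 7.0]
funny_score = [8, 5, 9, 4, 6, 9, 5]

r_all = pearson(hours_slept, funny_score)
# Drop one day (the most sleep-deprived) and recompute as a sanity check.
r_minus_one = pearson(hours_slept[:5] + hours_slept[6:],
                      funny_score[:5] + funny_score[6:])
print(f"r with all 7 days: {r_all:.2f}")
print(f"r with 6 days:     {r_minus_one:.2f}")
```

Even a stable negative r here wouldn't license the headline; it's one subject, one scorer, and no controls, which is exactly Andrea's point.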
[1:10:16]But it's like, I would release that study a year from now, and the headlines would read something like, The Secret to Being Funny: Don't Sleep Ever Again. You know, it would just go from this very specific, nuanced, narrow, possibly pointless, but maybe slightly interesting thing to something where you have cults of people only sleeping two hours a night in the name of being the next great... Exactly, exactly. You could just completely blow it up. And so that's what happens with all of this stuff. It's like science is either taken as capital-T Truth, which it's not, or it's taken as a whole bunch of lies peddled to us by the government, which it's also not, right? I mean, it's... There's something in between. There's something in between, yeah.
[1:11:01]You bring up being funny, and I mean, it's obvious to me how data science and stand-up comedy and being a circus performer correlate, you know, why you would do all three. But maybe for people who haven't made that jump and aren't teachers, can you explain how you do all three of those things? Right. I would be happy to.
[1:11:21]I would love to hear what you think is the correlation between those three things because my therapist and I can't figure it out. But I like to think that each one helps satisfy a different part of my brain. And I think this is probably true for many people, right? It's like you're stuck on a problem, like go for a long walk or, you know, I don't really do ball sports because I'm not coordinated enough for that. So some people play soccer. I just decided in grad school that I was going to be pursuing mental health by hanging from stuff that was on the ceiling. And so I got into circus actually in grad school. I did dance growing up and always did that kind of movement stuff. But I got into circus in grad school truly as a coping device because the main, I don't know how many of your listeners have gone through graduate school, but particularly PhD programs, the goal is to make you feel bad about yourself. I'm convinced that that's the goal.
And boy, did it succeed. I felt so bad about myself that every Sunday I would go hang from random crap in a warehouse in Detroit. I went to the University of Michigan, and that would, like, rehabilitate my self-esteem. So circus became this very playful, exploratory thing. There are no wrong answers. No one cares about your p-values and your study. It was just lovely to go somewhere and talk to people who are not political scientists, as much as I love political scientists. And then comedy is sort of in between; it kind of connects those worlds for me. So circus, you know, it's very disciplined and there's a lot of physical pain, but it keeps you out of your head, right? Okay. Right. Data science and teaching and talking about data science and doing data science is very heady. It's only your head, right? Stand-up, and improv to an extent, I'm not so good at improv, but comedy is kind of in between. You're in your body. It's a live performance. You're doing things in the moment. You're reading the room. You're on your feet, literally, you're standing up, and...
[1:13:08]It's totally your head as well, right? Because, like, think of your favorite comedians: they're absolutely brilliant. They're quick, they're sharp. They can think of things on the fly. The crowd work is dynamite. They notice things in the world, like, oh, that's so true, why didn't I ever think of that? That kind of stuff. So they have the minds of scientists, but the bodies and the room awareness of a circus performer. I never thought of those in that connection before, but it brings together the performance and the thinking, I think, for me. You know, here's something I think you're going to love. You know Alan Alda, right? Yeah. Alan Alda has something called the Alan Alda Center for Communicating Science. It's at Stony Brook University. And what he does is teach scientists to do stand-up, to do improv.
Does he? I didn't know that. Yeah. And he does this because he says that if you can get good at improv, you can change the way you're trying to explain science to somebody and make a better impact, because you're adapting to what they're doing instead of just being, I have these rote things I want to say. And it's a real odd thing. It's actually like a foundation sort of a thing. He has a really great podcast, which... I'm blanking on the name of it, but I'll come up with it in a second. Anyway, all of the money that he earns from doing the podcast goes to Stony Brook University and to this program. So I knew about the Alan Alda institute, and I remember searching for it at various points when I was like, who's doing a really good job of communicating science? Oh, look at this whole institute, it's amazing. But I had no idea they were doing improv.
[1:14:40]I strongly believe that everyone should take an improv class. Obviously, improv classes are expensive and people have time constraints, but if you can, and you're looking for something new to do, or just to try something different in the new year: improv. I am not a good improviser. I moved to stand-up because I always wanted to do stand-up, but I was too afraid of being on stage by myself. In improv, you're on stage as a group, and you're playing games, or an audience gives you a suggestion and you make a scene based on it. So it's interactive, and I was like, this is safer. But in all the scenes, I would just stand there, like, with my arms crossed, and just speak my own point of view. And I was like, maybe I'm in the wrong discipline. But I did a ton of improv classes, and it made me way more comfortable on stage for stand-up. And most importantly, it is the one thing that helped me the most when I became a teacher for the first time. When I was in graduate school, they don't teach you how to teach. They teach you the material. And then you work as a TA, or teaching assistant, but you're kind of just running discussions and you're kind of just grading. You're not really teaching your own class. The first time I ever taught my very own class, I was a postdoc at Carnegie Mellon. It was a class of only, like, 12 or 13 students. It was this weird upper-level international relations, audience-costs-related media thing.
I was so nervous. I was like, I have no business being a teacher. I cannot be in charge of this class. What am I going to say? And then I remembered: in improv, I would go on stage with no plan and entertain a room full of audience members. I mean, I had other people with me, but with no plan, and do a 60-minute show, and you get through it. And I reminded myself of that as I would go into my classroom every day. I was like, I have slides, I have a plan, this is not as hard as improv. And that reminder, and then that skill set, is what allowed me to teach and not freak out more than I already did. So if you're ever looking to improve communication skills or presentation, whether you're a scientist or working at a company or whatever, I cannot... I don't know, have you done improv? I haven't, but that makes me think... it never occurred to me to try that.
You've got to. I do enjoy making people laugh. That's for sure. Yeah. No, and it's so playful and it's silly. And it's so rare as adults that we get to be around other adults and just be, like, goofy. And you're not, like, goofy, like, making faces and being silly. You're kind of, like, having scenes and you're exploring things. And, you know, I think really good improv is as good or better than really good stand-up. The problem is the variation in improv is really, really high. Like, bad improv is really hard to watch, whereas bad stand-up is sort of interesting in its own way sometimes. But it's just so...
[1:17:13]I'm talking myself into taking another improv class. It's just... I've been trying to convince my partner, but he does not want to be in front of people. He does not want to be on stage. He does not want to be known. Yeah. But I was like, you've got to take improv, just to get out of your own head a little bit. Oh, Steve... We were on a cruise one time with some friends, and they had a karaoke thing, and one of our friends got up and did karaoke. And Steve told me afterwards that to this day he gets, like, cold sweats thinking of that. If he goes to hell, that is what he will be forced to do. That is the last thing in the world he would ever do. And I thought, man, that looks fun, you know? Oh, I hate karaoke. I'm with him on that. Clear and Vivid is the name of the Alan Alda podcast. He has a science one and then just regular Clear and Vivid, but they're both... It's really science. Okay. Okay. But they're really good. Clear and Vivid, that's a really good name. I'm sorry I didn't think of that. That's really good. He's got spectacular guests, and everybody he talks to is interesting. He has comedians. He has scientists, astronomers, just the whole gamut. It's really, really fun. Okay, cool. The last thing I wanted to ask you about was...
[1:18:22]On your website, jonesrooy.com, and that's R-O-O-Y, just to make it harder to find. Yes. It says you're the founder of the data thinking revolution. What? What is that?
[1:18:36]It's a great question. It turns out you can be a founder of anything if you just say so on the internet. So it's not, you know, an LLC or an Inc. or anything. It's just the thing I'm thinking about. It's basically my shorthand working term for everything you and I have just been talking about, which is: when people think about data science and analytics and predictions and all these kinds of metrics and stuff, they tend to think about, of course, the numbers themselves, which are very important. And, you know, what programming language should I learn? Should I learn Python? Should I learn R? Maybe Tableau? We get hung up on the tools of it. And there's sort of an arms race in terms of, well, I know Java and C++. And you're like, well, okay. And that's very valuable, and it's a huge part of how we're making really interesting discoveries. And there's a lot of interesting work, particularly with the computation, improving our abilities to make causal inferences, like we were talking about earlier. But there's a whole other chunk of data science, which is basically thinking critically and thinking like a scientist and going through the exercise, like we talked about with you and your theory about charging stations, which is like, oh, this is an interesting pattern. I wonder what it could mean. I wonder what could explain it. Here are a bunch of different narratives that could tell that story. How do I discern between those? How do I do future studies? What are the other things that could be going on besides causality in this world? All that stuff. And we haven't even talked about a lot of, like, how is this measured? Where did this data come from? Is this data even picking up what we think it's picking up?
[1:20:04]All of that thinking that happens around making discoveries with data is what I'm calling, for now, the data thinking revolution. This is why I'm so upset that he thought of Clear and Vivid, because that's so good. You know, I need something catchier, but that's what it is. It's basically trying to get people to... yeah, I just need Alan Alda's brain for the rest of my life. That's all I need. I love it. Well, can I be part of the data thinking revolution? You sure can. You are currently in charge of it, because you've watched more of those videos than I have. I'm not sure I've watched my own videos all the way through. So I think you're leading the revolution, for sure. Well, they're wonderful. The podcast Behind the Data at BehindTheDataShow.com. The YouTube series... everybody, just go watch one, and I dare you to stop after one. I'm so excited. I don't think you'll be able to do it. Yeah, this has been great fun. Is there any other way people can follow you, should follow you, than what we've talked about? I mean, those are the big ones. I'm ever so slightly on Instagram, at jonesrooy, with two O's as well.
[1:21:11]But I recently went through a phase where I decided social media was bad for me, so I torched all my social media. So it looks like I've been on the internet for about 12 days. But I'll slowly be putting some clips and stuff up, probably from the videos. So YouTube is probably the place to find those videos. Okay, cool. Cool. Because I did look around for you on that. I'm sort of like a Mastodon girl, and so that's where I'm living, over there with the nerds. And that's been fun over there. How is it over there? Is it good? Yeah, it's nice. There's no algorithm, so you follow people and then you see what they post, and you can follow hashtags and you see what people post. And it's a lot less Nazis, too, as it turns out. So, I mean, if you want Nazis... well, there are actually Nazi servers, servers that allow Nazis, but it's a federated system. So let's say I've got a Christian server; I can defederate the Nazi server, and then nobody on my server will ever see that. But there's big servers where it's just, you know, mostly everybody's okay, but please, no Nazis. So just carve off the Nazis, and then everything else is pretty much okay. But it's a whole lot less hate and less anger. And virtually every photo has alt tags on it for the blind. Yeah, it's really weird. It's a thing.
[1:22:34]People won't upvote you if you didn't put an alt tag on your images. Oh, I love it. Oh, that's great. Yeah. All right. You've persuaded me. I did the thing where I basically got rid of all the social media, but I didn't really put any back. So maybe that's where I'll start. It's nerd heaven. A lot of science stuff. It could be good. Yeah. Oh, cool. Okay. All right. This has been so much fun. I'm so glad we're besties now. Yes, we are. Yes, indeed. Thank you for watching everything and for the great conversation.
[1:23:04]Do you see what I mean when I said she is awesome? I had such a great time with her. I hope she can come on the show and talk about more stuff later. Maybe I'll ask her about what it's like to be a circus performer next time. I don't care. I want to talk to her all the time. Remember, there's a link in the show notes directly to that interview if you want to send it to somebody else because you're as excited as I am. And in the show notes, you can also find a link to the video series, Data Science for Everyone. And I hope you go check out even just the first one and see if it's as fun as I said it was. But that's going to wind us up for this week. Did you know you can email me at allison@podfeet.com anytime you like? If you have a question or suggestion, just send it on over. Remember, everything good starts with podfeet.com. You can follow me on Mastodon at podfeet.com slash Mastodon. If you want to listen to the podcast on YouTube, where do you want to go? Podfeet.com slash YouTube. If you want to join the conversation, you can join our Slack community at podfeet.com slash Slack, where you can talk to me and all of the other lovely NosillaCastaways. You can support the show at podfeet.com slash Patreon, or with a one-time donation at podfeet.com slash donate. If you go there, you can pay with Apple Pay or any credit card. Or if you like PayPal, podfeet.com slash PayPal. And if you want to join in the fun of the live show, head on over to podfeet.com slash live on Sunday nights at 5 p.m. Pacific time and join the friendly and...
[1:24:21]Music.