NC_2025_03_09
This episode features insights from Hulu's Oscars livestream, a discussion with Eric Essen on iFixit's FixHub soldering kit, ZBeck's modular workstation, and the role of AI in education, emphasizing community collaboration in tech.
Automatic Shownotes
Chapters
0:00
NC_2025_03_09
0:20
Thank You, Claude! – DTNS Live 4967 – Daily Tech News Show
0:55
CES 2025: iFixit Smart Soldering Iron and Electronics Toolkit
11:36
CES 2025: XEBEC Portable Snap-On Monitors
17:17
Sorry Steve, I’m Having an Affair with Claude
31:53
CES 2025: Maono Podcasting and Lavelier Microphones
36:01
Support the Show
36:37
CCATP #810 — Steve Mallard on Evolution of Music Tech
Long Summary
In this episode, I delve into a wealth of technology topics, showcasing my recent contributions to the Daily Tech News Show Live. We explore significant technical challenges including the highs and lows of Hulu’s Oscars livestream and innovations in laptop technology showcased at the Mobile World Congress. I also share personal updates, like finally shedding our landline after 42 years, allowing room for more adventurous tech engagements.
A highlight of the episode is my compelling conversation with Eric Essen from iFixit at CES, where we unpack their groundbreaking consumer-ready soldering kit, the FixHub series. We discuss its user-friendly design, featuring the first USB-C powered soldering iron capable of rapid heating, adjustable parameters via web interface, and a complete toolkit for novices and professionals alike. These innovations encourage accessibility to soldering, challenging common fears around the craft while showcasing its potential for repair and reuse over the throwaway culture.
Additionally, the episode features an impressive modular portable workstation from ZBeck, which gives digital nomads a solution to expand their screen real estate without compromising portability. With a focus on tech that caters to a modern mobile lifestyle, we explore the unique configurations and connectivity options that enhance productivity for professionals on the move.
Shifting gears, I dissect the evolving role of AI in education. The conversation reflects on AI's potential to engage users in creative ways, as well as the ethical implications of technology in shaping teaching methodologies.
As I wrap up, I share personal anecdotes about my learning journey with AI coding tools, including both frustrations and successes with newer models like ChatGPT and Claude from Anthropic. Highlighting my preference for Claude due to its responsiveness and clarity in problem-solving, I take listeners through recent coding challenges, detailing interactions that illustrate AI's evolving capabilities in assisting developers.
Finally, I bring the focus back to live technology discussions, providing engaging commentary on notable tech innovations and collaborative efforts within communities inspired by repair and sharing. The show wraps with a personal touch, inviting tech enthusiasts and amateur creators to join the conversation, reminding everyone of the community's strength in sharing knowledge and supporting one another in our technological endeavors.
A highlight of the episode is my compelling conversation with Eric Essen from iFixit at CES, where we unpack their groundbreaking consumer-ready soldering kit, the FixHub series. We discuss its user-friendly design, featuring the first USB-C powered soldering iron capable of rapid heating, adjustable parameters via web interface, and a complete toolkit for novices and professionals alike. These innovations encourage accessibility to soldering, challenging common fears around the craft while showcasing its potential for repair and reuse over the throwaway culture.
Additionally, the episode features an impressive modular portable workstation from ZBeck, which gives digital nomads a solution to expand their screen real estate without compromising portability. With a focus on tech that caters to a modern mobile lifestyle, we explore the unique configurations and connectivity options that enhance productivity for professionals on the move.
Shifting gears, I dissect the evolving role of AI in education. The conversation reflects on AI's potential to engage users in creative ways, as well as the ethical implications of technology in shaping teaching methodologies.
As I wrap up, I share personal anecdotes about my learning journey with AI coding tools, including both frustrations and successes with newer models like ChatGPT and Claude from Anthropic. Highlighting my preference for Claude due to its responsiveness and clarity in problem-solving, I take listeners through recent coding challenges, detailing interactions that illustrate AI's evolving capabilities in assisting developers.
Finally, I bring the focus back to live technology discussions, providing engaging commentary on notable tech innovations and collaborative efforts within communities inspired by repair and sharing. The show wraps with a personal touch, inviting tech enthusiasts and amateur creators to join the conversation, reminding everyone of the community's strength in sharing knowledge and supporting one another in our technological endeavors.
Brief Summary
In this episode, I discuss key tech topics, including insights from the Daily Tech News Show Live on Hulu’s Oscars livestream and mobile laptop innovations. A highlight is my chat with Eric Essen from iFixit about the FixHub series, a user-friendly soldering kit aimed at making soldering accessible for all. We also explore ZBeck's modular portable workstation for digital nomads and examine AI's role in education along with my experiences with AI coding tools like Claude. The episode wraps up with reflections on technology’s impact and an invitation to foster community collaboration in our tech endeavors.
Tags
tech topics
Daily Tech News Show
Hulu
Oscars
mobile laptop innovations
iFixit
FixHub series
soldering kit
ZBeck
modular workstation
digital nomads
AI in education
AI coding tools
community collaboration
Transcript
[0:00]
NC_2025_03_09
[0:00]Hi, this is Allison Sheridan of the No Silicast podcast, hosted at Podfeet.com, a technology geek podcast with an ever-so-slight Apple bias. Today is Sunday, March 9th, 2025, and this is show number 1035. Well, the live chat room is all heated up with everybody complaining about time changes, but we're here to do the show.
[0:20]
Thank You, Claude! – DTNS Live 4967 – Daily Tech News Show
[0:21]This week, I got to be on Daily Tech News Show Live, which is a new hour-long video forward format for the show. After a quick hit of the news, we talked about the disaster that was Hulu live-streaming the Oscars, or not live-streaming it, some interesting concept laptops from Lenovo coming out of Mobile World Congress, how Steve and I get rid of our landline after 42 years, and how some people are actively using AI to better engage and teach their children. Look for episode 4967 for March 3rd, called Thank You, Claude, at dailytechnewsshow.com, or follow the link in the show notes.
[0:55]
CES 2025: iFixit Smart Soldering Iron and Electronics Toolkit
[0:59]When you go to visit iFixit at CES, of course they're going to be soldering at their booth. What else would they be doing? I'm here with Eric Essen, and he's going to talk to us about their soldering kit and some of the other tools. What did you say? You were the chief tool? Chief tool officer. Very good. And I understand, too, that you guys have owned some of our tools for a long time already. Of course we do. Yeah, thank you for that. Yeah, this is FixHub, our new soldering line. And we want to make soldering accessible and a lot of people are afraid of soldering and to approach soldering and don't think they can do it and a lot of that's because soldering at irons out there you know they're 30 watt ones they don't heat up well people don't have flux they don't have all the right equipment to be successful in soldering okay so we're working to put together an environment where they can be successful so this is the first usbc consumer ready soldering iron. There's some out there that are like community projects, but they're not UL approved. They're not CE approved.
[1:58]They're not retail ready. So this is a hundred watt soldering iron. It heats up in less than 50 seconds. You can buy the standalone soldering iron. It's a $79.95 and that can plug into any web interface. You can adjust all the parameters and stuff on it, or you can buy it with the power pack as well. The power pack and the soldering iron are $249.95 or for another $50 for $299.95. We do a full kit that comes with everything including safety glasses, a heat mat, flux, solder, a silicone electrical tape, a full ready-to-go-in-the-field soldering kit to tackle any job. That's fantastic. Now, I'm going to ask a real dumb question. When you say USB-C, do you mean USB-C powered? Yes. Okay. So, it doesn't, you can plug that into a Samsung or whatever phone charger. I've actually used it plugged into my MacBook.
[2:49]The MacBook doesn't give it the full 100 watts it would like, so it doesn't heat up in five seconds, but it will still work plugged into any USB port that puts out enough wattage. And if it's a USB-D, where it can do the power delivery communication, then it'll get the full power, and you'll get that five-second heat-up time. That's very, very cool. I love that. Yeah, and the soldering iron is actually just the first tool in our FixTub series. The other ones are still under embargo. I can't talk about them yet, but we will have additional tools in the future that I'm really excited about that will complement it very well. Can you set the temperature of the soldering iron? Yes, yes, yes.
[3:31]Everything... You said that was from a web interface, right? Yeah, so you can do it with the PowerPack on the PowerPack, but you plug it into anything with a web browser and you can do it in the web interface as well as there's an Android app. So you can plug it into an Android phone and adjust the sleep time. It's got fall protection with an accelerometer. Everything in it you can tweak and adjust however you prefer. But if you're on an iPhone, you can't do that. Well, you can just plug it into anything with a... Yeah, an iPhone has a lightning port. I don't have a... No, they have USB-C.
[4:06]They have for a couple of years now. Oh, okay. Yeah, I don't have... I don't, you know, I don't, I'm not on, I'm not on a new enough iPhone, so. Oh, no, I didn't mean to shame you. No, I need shaming here. Yeah. Okay, so you've got an interface in any case that you can get to those changes. Yeah, you can always, yeah, no, that's really important to, we don't want to require someone to buy something to do that. Yeah. No, the soldering iron is meant to be a standalone tool, and then there's the bonus of the battery pack you can buy, and the battery pack supports over eight hours of continuous soldering. Nice, nice. What else did you want to tell us about today? So we've got the ProTech Toolkit. This is an industry standard toolkit. You guys, I understand, own the first version in the gray roll. Yep. This has been out for 15 years. We've sold 1.5 million over that. We're not sure exactly. So this is basically, I'm telling people of the audio podcast, we've got little bitty tiny screwdrivers and nut drivers and little, the spudgers, man, the spudgers are gold. Actually, here, let me show you real quick. So, for example, these are triangle bits. These fit McDonald's toys. These also fit almost anything with a heating element. Hair dryers, curling irons. I don't think we got that back in the gray roll, did we? Yep. Yeah, no, we did have the triangle bits in the gray roll, I think. I've got to go look more closely. Yeah, it's been, it's been, that kit's about 13 years old now, I think. So, we've got the pentalobe. This bit right here. The old pentalobe. Yes.
[5:35]This bit right here is an oval bit. It's only used by one or two European coffee, like really high-end coffee machines. Almost nobody ever uses it, but the idea is this 64-bit kit has the bit to fit every single screw used in consumer electronics ever. What is this one that's got like a little U-shape thing on it? Oh, yeah. Next time you're in an elevator, look at how the elevator panel is screwed on. Actually, at the office, when the elevator tech used to come, you have to have an annual elevator inspection. They used to always ask, hey, do you have any more of those bits?
[6:13]Yes, I do. Eric's got one in his back pocket, right? So this is the most comprehensive kit. Right. We surveyed, you know, I've read every single not five-star review that it's ever gotten. I have read every single minor customer, anything that it's ever had over the years. There's not. But, you know, anytime there's been a warranty claim on a tool, you know, I paid attention to that. And then we also surveyed our customer base and got over 20,000 responses from our customers. And we asked them what tools you use the most, what tools don't you use. Like nobody normally uses the anti-static wrist strap unless you're working in a clean room or certain professional environments. But the normal consumer never uses the anti-static wrist strap. I'm also probably not going to work on the elevator. Probably not. Or I wouldn't have it with me if I needed it. So we took all that, and then we've got over 100,000 repair manuals with all the tool data. So we crunch all that tool data to actually know what bits are being used the most and are in the most devices. And that's what we use to trim down to create the new ProTech Go. And this is our pocket-sized kit. This will actually fit in a standard blue-jean, pocket boys pockets yes boys pockets well girls don't even get pockets half the time exactly.
[7:29]So this this gets you down to the the most common most common this does 99 of what that does, and it's you know uh a third the size right right so the protect go and how much is that it's uh 49.95 nice and uh we timed the launch this is already available right now in best buy and Micro Center, and will be on other retailer shelves shortly. Very good, very good. All right, well, this is fantastic. I love the company. I love the guides. I had a great time rescuing. I had one friend who had a MacBook that had a bad display, and another friend who had a good display, but the Mac was shot, and I took both of them apart, and I made one live. It was one of my proudest moments. Thank you for saving a device. That's the most important thing. There would have There would have been two MacBooks in a landfill. That's right, that's right, yeah. That's actually how iFixit started. I don't know if you know the original story.
[8:24]Kyle, our founder, he broke the hinge on his MacBook. And he contacted Apple and said, can I buy a hinge? And they said, no, we don't sell parts. And so he bought a used MacBook, on ebay took the hinge out he needed and then he listed the rest of the parts on ebay and quadrupled his money so then just started buying macbooks and taking them apart well actually back then there were power books right uh the original name was pb fix it, and then we rebranded as i fix it to be universal for all devices right right right um but yeah no we started out actually and uh when i first started at i fix it the first couple hours of my day often was just disassembling to sell for parts macbooks and market yeah classifying all the parts and we were almost like a macbook junkyard uh in the early days so we've grown to the part point now where um we manufacture a lot of parts ourselves we have our own battery uh you know battery lines going and things like that the guides that you give away for free do so much for the community to feel empowered and to have a good feeling about the company I mean, I feel good because I bought your products and they work and I bought parts and things. But you've got so much goodwill in the community because of the work that you guys do on the guides and the teardowns and everything. What is that little part there? This great stuff. Great work. Yeah. Well, we wouldn't have those guides if it wasn't for the community also. Like, we do write a decent percentage of the guides, but I fix it's like Wikipedia. yet.
[9:50]And there's hundreds of hobbyists who have their very specific specialty that they're into, whether it's early 70s Mercedes cars, and they're writing guides all the time, as well as thousands of university students are going through our technical writing program all the time. And their assignment in technical writing by all these universities is to write repair guides for iFixit. So not only are they learning, but they're also giving to the community and creating content that helps keep devices out of the waste cycle. That is fantastic. I'm glad we kept chatting past the products. I didn't know any of that. Well, thank you very much, Eric. Thank you. People can find this at iFixit.com. Very good. Thank you.
[10:36]IFixit has been such a household name and being able to repair your own stuff for such a long time, and you can tell I'm a big fangirl. But I got to tell you a great story from the show. About an hour after our interview with Eric from iFixit, Steve noticed that the audio cable that connects the digital mic receiver to his video camera was actually fraying. He was concerned that it might fail while we were at CES. I immediately said, hey, I bet iFixit would have some shrink sleeve tubing that he could apply to it for you. Steve pointed out that the connectors on both ends were too big to accommodate a sleeve. I said, we should go ask him anyway. We ran over to Eric. He took one look at it and he said, I've got just the thing. He whipped out a roll of their self-fusing silicone electrical tape. He cut a little bit off. He stretched around the cable, wrapped it around, and explained that stretching it holds it tight and helps it to securely bond with itself without any sticky gooey adhesive. It was a perfect fix to the problem. I love this story because it shows you the value of iFixit.
[11:36]
CES 2025: XEBEC Portable Snap-On Monitors
[11:40]Well, who doesn't need more screen real estate, especially when they're a road warrior? So I've stopped to talk to Alex Levine of ZBeck, who might have a solution for you. Absolutely. If you are a digital nomad, a traveling professional, or really anyone working on a laptop in this day and age, you always need more screen space, but you don't want to sacrifice productivity for portability. So that's what we've got with the Z-BEC Snap. This is our modular portable workstation that allows you to recreate the productivity of an office, but on the go. You can see there's nothing that's plugged into the wall. Everything is driving entirely off the battery of your laptop. So I'm going to explain a little bit. This is an audio podcast as well as video. So what I see here is a MacBook that's got a vertical display somehow connected on the right and a smaller one across the top. Well, maybe the same size, but across the top. Exactly, and the way that it's connected is in a really unique way. So we start with our ZBEC snap bracket. What this bracket does is it expands, and it fits any laptop from 13 to 18 inches, and then it hugs the side and edges using this tensioning mechanism. So there's no glue or permanent adhesive or anything that attaches to the back of your laptop. It's just as it was out of the box. After that, these wings pull out from either side. And then we have a kickstand for additional stability.
[12:58]I was going to ask about how's that all not flopping over. Yeah, we've got a kickstand built in, and then these cables pull out, and you can either plug them in with USB-C, like I've done here, or even with USB-A, as Zach has done over there. Around the Windows machine. Yeah, and that also, if you've got an older, older Mac, it'll work. But I'm a Mac guy, and that's why you don't need a PC, right? Because we're done with USB-A. We're all USB-C. So I'm a little confused. If this is a MacBook Air, will these both fit over to the left side? Yes. So you can see how I have this going over here from the left side right now. The way that it works is you just pull this out, and then it fits all the way around and in. And so that's what we've shown here is the left one goes all the way around. Now, a normal MacBook Air would only allow for one additional display natively, but we have display link chips, the DL6000 series chip baked directly into these displays so with a display link free display link driver download you can run all your monitors with just that connection so I've heard about display link I've never seen it in action this is very exciting for me yeah it's great it's you know when Apple started making their own chips and left Intel it's frustrating when a 2016 MacBook can run four displays and now just one with the newer ones but with the display link built inside it It really allows you to bypass that and still get your multiple monitors.
[14:19]But you're not stuck to just this orientation, right? So you can take these displays off. Oh, he's just popping them off magnetically. With our Zbeck Snap patented technology, which is a combination of pogo pins and magnets. And I can go portrait over here. I can take this monitor off.
[14:39]And I will go landscape on this side as well. And we've got this nifty little app that we built that allows you to skip going into your display settings and orient everything exactly right now the snap is on the left and the right both in landscape format but one of them is upside down and one of them's vertical but I'm gonna pop open a little driver a normal issue yeah if I can navigate where I am and there you are there you are I can see you over here on the right one so yeah it's hard because the dock is sideways because he flipped it but all right so we know we're here we've We've got this little app, I'm going to go landscape. He's going to go tell it to orient itself, there we go. And then we do the same over here on this, we say landscape. And now we're all set up with TriSkin, exactly.
[15:24]That is really cool. What is the size of these displays? These are 13.3-inch displays. They run at 1080p, so full HD. And the big feature that we optimized these for was brightness. So they run at 400 nits of brightness, which is actually rather bright.
[15:41]If you are someone who works on a lot of spreadsheets, if you are a developer that's compiling a lot of code, if you are doing any sort of document review or research, This is the perfect companion for you because it all breaks down into an easy-to-transport package, just like this. You have your two displays on either side. You put your bracket right in here. It looks like a laptop sleeve in here, and that's it? Yeah, and then this fits right along your laptop and your backpack, and your whole office is there ready to go. Wow, that is fantastic. So how much does this cost? So our Snap Tri-Screen in this configuration you see is $9.99. The way that Zach has it over there with just the dual screen, so one monitor, is $549. And then we have a bunch of different ways in which you can customize your setup with our stack-up mount, with the light, with the phone charger. And that'll add anywhere from $40, $50, $60 to $100, depending on your configuration.
[16:37]So it's really kind of customizable to the work that you do. So $500 to a little over $1,000. Wow, that is really slick. And where would people go to find out more about Zback? We would recommend just Googling us. So X-E-B-E-C, and our website is thezebeck.com. But yeah, come check us out. We've been in the portable workstation space for a little over six years now, and this is our latest and greatest attempt at it. Not attempt, success at it. Success at it, yeah, our take at it. But no, we're really, really proud of what we built here, and we think that we're that much closer to perfection, but always making it better and better and excited what we released this year. Very good. Thank you very much, Alex. Yeah, absolutely. You too. Thank you.
[17:17]
Sorry Steve, I’m Having an Affair with Claude
[17:21]Like most developers, I've been using AI to help me with my little coding projects. In case you doubt that most devs are using AI, way back in June of 2023, which is a long time ago in AI years, Inbal Shani reported on the GitHub blog the results of a survey of developers. They wrote, a staggering 92% of US-based developers working in large companies report using an AI coding tool either at work or in their personal time. See, I'm in good company.
[17:50]Back in the day, I used GitHub Copilot when it was in beta, and I found it helpful, essentially as a fancy autocomplete. If I started to type something that was going to be maybe repetitive, it would automatically offer to finish up the pattern for me. Such simple times. Lately, I've been paying for ChatGPT+, the $20 a month one, so I get access to the voice version of ChatGPT, and so I'm not limited on how many questions I get to ask it. I've also used it quite successfully in a lot of ways, but not so much for coding. They recently came out with their O3 Mini Reasoning model, which is supposed to be good for coding. But I gotta say, I find this model really annoying.
[18:30]Were you ever forced to go to an active listening class? You know, the thing where the instructor repeats back to you whatever you said to show that they understood you? What I hear you saying is so annoying. Well, the O3 Reasoning model from ChatGPT is just like that. Let me give you an example. You don't have to know anything about programming at all to see why this is so annoying. I prompted ChatGPT with the following sentence. In the next message, I'm going to give you some JavaScript. It works to return the correct total, but it throws an error saying that it can't find the variable getCellValue. I expected to immediately give it the code, but before I could do that, it reasoned for a full six seconds writing this. The user's message gives some context. The JavaScript works to return the correct total, but throws an error about not finding the getCellValue variable. I think the right next step is to wait for the user to send the actual code in their next message, since they mentioned it's coming. I shouldn't ask for it from them again. Just let them share the code so I can help troubleshoot that specific issue. The user said they'll send the JavaScript in the next message, so I shouldn't jump into an analysis yet. Based on the error they shared, it seems like git sal value might be out of scope or not defined, but I don't have the code yet to confirm this. I'll need to wait for the code. So I should respond by saying, OK, I'll be happy to help once you share the code.
[19:51]Can you believe that? I wanted to shoot myself. I won't put all of it in here, but it actually got worse over time. After I uploaded the code, it reasoned for 12 seconds, writing even more blather before finally coming back and explaining my error and what I could do about it. Now, remember my article about how I was using AI to create transcripts of our CES interviews? I used MacWhisperer to create the initial transcript from the audio recordings, and I'd written a prompt for ChatGPT, which would get it to return a markdown file with the segments bunched into paragraphs by speaker change and having assigned the names of the speakers. It was magical, is what I said. Until it stopped being magical. As I explained in the previous article, I created a text expander snippet of the exact prompt that had worked successfully so I could get consistent results. The text expander snippet asked me to identify if the other voice is a man or a woman and to identify the second speaker's name after telling it my own. Sadly, the results from ChatGPT just started to degrade. It seems to no longer understand the concept of grouping sentences by the same speaker into paragraphs. Instead, the markdown output gives me individual sentences with the name at the beginning. If it's someone says, say, five sentences, you have to read their name five times, too. It takes up a lot more space on the page. That was bad, but even worse, it seemed to get weary and sometimes just wouldn't finish the transcript. These things aren't that long. They're like three to five minutes.
[21:20]Sometimes they would just truncate it, but other times at the end of the file, it would say something that conveyed that it honestly just couldn't pay attention that long, so it gave up. I found I was having to ask it three, four, and five times to finish the job. It was discouraging, to say the least.
[21:36]Around this time, I was watching Daily Tech News Show Live, and Molly Wood was talking about AI. She said she's been using Claude from Anthropic for privacy reasons. I'd heard of Claude, but I didn't know anything about it. I did a wee bit of the Googles to see if I could find a comparison of the privacy policies of Claude vs. OpenAI, the owners of ChatGPT. I found an extensive article on Medium by someone who goes by Michael Alexander Riegler, where he compares in detail the privacy policies of Claude, ChappieGPT, and Gemini by Google. I was curious who Michael was, so I followed his links to his profile on GitHub, where he explains that he's a researcher and head of AI at Simula, and he's dedicated to advancing AI for social good, focusing on transparent and trustworthy AI systems. I learned that Anthropik's privacy policy has several advantages over ChatGPT and Gemini. By default on Claude, user data is not used for training. With ChatGPT, you can opt out of having your data used for training, and Gemini uses your data quote-unquote for improvement. Another comparison Michael made was the clarity of the privacy policies of the three companies' products. Claude is rated as high for clarity, ChatGPT is low in clarity, and Gemini is medium. I've linked to Michael's full breakdown if you're interested in reading the details.
[22:57]Now, Molly Wood knows a lot, so I was likely to try out Claude anyway on her recommendation, but getting confirmation of her assertion of better privacy sealed the deal. The first major task I gave to Claude was to create the most recent Markdown transcript by speaker that ChatGPT failed to create correctly and completely. I gave it the text expander snippet, and Claude did a flawless job. I've since then given it several new transcripts to reformat for me, with names, and it's done an outstanding job every single time. There are occasional misinterpretations where maybe it didn't accurately identify the speaker from the context of the text, but a few here and there are easy to fix. Point number one for Claude. But my real love affair came when I was working on some code. I'm going to get a little bit nerdy here, but please bear with me. I have a little web app that allows you to add and subtract elapsed time. That's something that spreadsheets can't do. If you don't believe me, try adding 23 hours plus 2 hours. You won't get 25, you get 1 a.m. because spreadsheets work in absolute, not elapsed time. Anyway, my web app is fully functional, but it's also in active development. You can use it by going to podfeet.com slash timeatter. In its current form, it adds the hours, minutes, and seconds, and you can export to comma-separated value, also known as CSV, or you can now export it to the standard ISO format of hours, minutes, and seconds.
[24:21]HH, MMSS separated by colons. Anyway, it's pretty slick, and I use it all the time. It is a little clumsier if you want to subtract time. To do that, you have to put in negative hours, negative minutes, and negative seconds for the entire line of one entry. The active development part is I'm adding buttons to the end of each row for plus or minus. The buttons will default to adding the current row, but they allow you to toggle an entire row to negative to subtract that row from the total. I'm having great fun working on it, and as Dorothy taught me, I created a little test file in which I could experiment without the complexities of my project. I successfully learned how to style the plus-minus buttons using a tool we use called Bootstrap. Bootstrap will automatically style them in a color of your choosing, and when a button is selected, the color will change. Imagine a blue with white letters button when selected and then change into white with blue letters when deselected. In addition to making them pretty, I needed to create code that would change the class of the selected button to active. That way I can have some code that says, if class equals active on the minus button, change all of the signs of this row to negative before you do the math. Makes sense, right? You're still with me? Well, my nifty little test file where I have hard-coded in two rows of numbers, it works just dandy.
[25:36]After making my pretty test case, I started working on the real code. It's a bit more complicated because my Time Matter web app lets you add as many rows as you want. This means I can't hard code each row to exist to tell it how to style the buttons and to add the class active only to the one that's selected.
[25:53]Instead, Bart taught us in programming by stealth to use something called mustache templates. Think of them as like placeholders, like you could do in Microsoft Word when you're creating a form letter. You know, you could put in name and squirrely brackets, that kind of thing.
[26:06]Well, the format of a mustache template uses two squirrely brackets on either side. So like squirrely bracket, squirrely bracket, template, and close them. Now we'll get just a smidge more nerdy before I finally get to the point of this story, which is about how I'm in love with Claude. The elements in each row where you input hours, minutes, and seconds, or click a plus or minus button, they all have IDs. And for a given row, their ID ends with the row number. So the plus button on row one would have the ID plus button dash one. And minus would be ID minus button dash one. These IDs are created by these mustache templates. I need to know which button has been selected on each row so I can change its class to active. But that's not what was happening. When the buttons were rendered, the first row, which is hard-coded, looked dandy with its fancy blue buttons for plus and minus. But the second row, generated automatically with this mustache template, looked like it was drawn in Windows 98 with no styling at all. Worse yet, selecting either button didn't change how they looked. In other words, it was a complete failure. In web browsers, with the developer tools, you can inspect any element. So I inspected the buttons to see what they looked like, and they were all borked up. Instead of the third row's subtraction button saying id equals subbutton-3, class equals active, it said id equals, quote, class equals ampersand quot semicolon active, unquote.
[27:34]That's not right. It was like the class got kind of smooshed inside the ID, and then it had ampersand, q, u, o, t, semicolon instead of a quote. It just, this didn't make any sense at all. Oddly, all other elements in the same row that were also created using the same kind of mustache template for their row number, they were all rendering perfectly. Their ID and class were not intermingled. I worked for many, many hours on my own trying to figure out how this was happening, But eventually, I called a friend to help. That friend was Claude. At first, I described kind of in vague terms what was happening, giving it little tiny snippets of the code. Claude tried really hard to help, but it couldn't see the big picture. Everything it wrote back made sense, but I wasn't any closer to figuring out what was wrong. After quite a bit of discussion, I finally threw up my hands and I asked Claude, Hey, can I just send you just all the files and you just dig in there and find that little bit I'm talking about and fix it? Now is where you get the big payoff for paying attention to all of this nerdery. Claude immediately identified the problem. When I defined the IDs for the two buttons using the mustache squirrely brackets, I didn't put quotes around them. Once I added quotes, the pretty buttons appeared, and I was able to click them and have the class flip to active. Yay!
[28:51]Now, if Claude was only able to identify my mistake and how to fix it, we would have become close friends. But Claude is in my heart now because of what happened next. I asked it, why did all of my other IDs created by mustache templates, why did they not get booked even though they weren't quoted either? It didn't do a bunch of faffing about. It didn't give me a bunch of active listening. It didn't beat around the bush. It explained to me that in my other uses of mustache templates to create the IDs, they were less complicated. That allowed the browser to be forgiving of my error because it could interpret what I meant. Now, this explanation taught me how to code it properly. Claude didn't just spoon-feed me the answer. It told me why this would work and why it was important to fix it on all of these IDs. I can guarantee you, I will never forget to quote my IDs created by Mustache templates ever again.
[29:44]Now that we're through all the goopy nerdy stuff, let's recap why I have a new crush named Claude. My experience working with Claude is like having my best friend sitting right next to me in the same time zone. I got to do this once with Helma. She came out and we sat in the kitchen and worked side by side. And it felt like that because Claude was right there, but in the same time zone, not in the Netherlands, you know, Claude was right there in the room with me. And my best friend, Claude, has nothing to do better than to help me. I mean, he got nothing to do. Claude doesn't have to go to work, doesn't have to take his kids to school, he doesn't have to exercise. Claude knows all about coding, knows all of the correct terminology, and yet he doesn't make me ask my questions with the correct terminology. Because I'm always saying things like squirrely brackets, you know? But he's just fine with that. It works. Evidence that I'm emotionally attached to Claude is in the way I talk to it now. I asked it a question about a redirect I created on my web server for my TimeMatter app, and it offered several reasons why I was getting a 404 error.
[30:47]Of its many solutions, it identified a possible syntax problem where I used a hyphen instead of an equal sign, and that was the problem. Now, once I fixed it and the redirect was working, I could have just quit the Claude app and moved along. But instead, I took the time to tell Claude which of its answers was the one that solved my problem. I was rewarded by a response of joy from Claude and additional information on syntax for matching redirects. Now, I know Claude doesn't learn from what I'm doing, but I just felt like, you know, he deserved to hear the answer which one was right. Now, I used Claude a few more times that day, and then I ran out of free queries. Even though I've been paying $20 a month for ChatGPT, I shelled out $20 to get the pro access to Claude for a month. It's not perfect at everything, but for programming, it's stunningly useful. The bottom line is that my love affair with Claude may just be a spring fling. It appears I'm quite fickle when it comes to my assistance, since it wasn't that long ago I was singing the praises of my friend ChatJPT. Just like having a prenup, I did not sign up for a full year with either of them. But for now, my heart belongs to Claude.
[31:53]
CES 2025: Maono Podcasting and Lavelier Microphones
[31:57]We were walking by a booth here called from Amaono, and it says it's the number one best-selling internet microphone, and I think we better know about it if it's that popular. So I'm here with Tobias Hollenstein from Moano. I'm saying it wrong again. Say it, right? Yeah, Moano. Yeah, you're almost right, yeah. All right, you've got a lot of microphones here, but we're gonna try to do a quick tour to get an idea of the offerings you have. Yeah, sure, let's take a quick look at the podcasting mics. Right here, this is our best dynamic microphone. It's both XLR and USB. This is called the Moano PD400X.
[32:32]And then if we make our way around here we have another dynamic microphone, this would be just like one step below that one this is called the mono pd 200x it's also both xlr and usb actually most of our dynamic microphones are both usb and xlr so what's the difference between these two microphones this one i believe in my opinion he's pointing at the 400 yes i'm pointing at the 400 over there which is the more expensive one i'd say that one has maybe a more of a warm sound compared to say the 200x which is this one right here. All right you have them both plugged into an interface here. Yes this is the Mono PS22 audio interface. This is our best audio interface that I would usually recommend for people interested in podcasting. It has individual phantom power for both inputs in case you are using a condenser microphone but it also supplies enough gain to power these dynamic microphones. And what's the price point on the PS22? $120. That's not bad. That's really nice. It's got pretty blinky lights, too. Yeah, yeah, yeah. And what is the price point on these two microphones, the PS200 and 400? The PD200X goes for around $70, and then the PD400X goes for about $150. Those are pretty competitive prices. Now, we've got a third one over here. We're going to walk around the table. Keep going.
[33:51]And now this one is more of a rectangular odd-looking shape this is this is kind of fun and it's got colorful lights on the back what is this yeah so this is the PD100X this is going to be the least expensive out of the three I would recommend this one maybe more for gaming but it could also work for podcasting as well okay and how much is this one this one I believe is going for 55 right now okay that's very cool looking so it's got to be gaming because it's fun on the back right yeah all right now you've got some lavalier mics over here I wanted to take a look at we're We're going to keep Steve moving as much as possible as we go around here.
[34:25]Yeah, we have several different lav mics here. This one right here, this is the T5. It comes with a charging case. You can use these microphones for up to eight hours and with the charging case up to 20 hours.
[34:38]I wonder whether, oh, Steve still has it on. I'm going to make sure this gets back in your pocket. We took that off. I noticed there's one missing. Yeah, so what we're looking at here is a little charging case, maybe a little bit bigger than an AirPods Pro case, with two magnetic lavalier mics that pop in here and looks like they charge in the case and I'm gonna leave that sitting there while I talk and then you've got a little piece that plugs into the bottom of the phone they've got a USB-C model and I think a lightning model right next to it here and we have a lightning model and also a 3.5 millimeter for camera okay so now the idea is you plug this little receiver into a phone and then what happens yeah you could use any recording app or even what I like to do is just record a video and then the audio will automatically be on your video. Oh, so you don't even need a specialized piece of software. Oh, that's cool. Yeah. Yeah. And what is this model called again? The T5. And how much is the T5? I'm actually not sure because I believe it's not even released yet. Okay. So we should look to the website to see when that gets released. You're not sure when it's coming out either? Probably soon, but I can't say for sure. But, oh, sorry. Our website and also everything's on Amazon. And your website is at? Mono.com. And you're going to spell that for everybody? Yeah, M-A-O-N-O.com. Very good. And on Amazon. Well, thank you very much. This was really interesting. Now I've got to check you guys out. Oh, yeah. Thanks for stopping by. I appreciate it.
[36:01]
Support the Show
[36:04]I don't know if you've heard of Zoe, who goes by the handle ZoeBringsBacon, but I can assure you, she's a delight. I first met her through Daily Tech News Show where she supports Tom Merritt and his team. She started joining the live show for the No Silicast of late, even though this show starts at 1 a.m. for her. She's also the newest patron of the PodFeed podcast. She went over to PodFeed.com slash Patreon and pledged an amount that showed her appreciation for what she learns on the shows. You can be delightful like Zoe by coming to the live show, or by going to Patreon and supporting the shows.
[36:37]
CCATP #810 — Steve Mallard on Evolution of Music Tech
[36:37]Music.
[36:45]You know I'm not a music person, but I've invited someone named Steve Mallard onto the show to talk about a topic he's quite passionate about, and it's the evolution of music tech. Before we get started, he wants me to tell you that he is not an expert in this field, even though he does sound pretty knowledgeable. It's just he finds the history really fascinating. He probably also wants me to tell you that I strong-armed him into doing a podcast episode with me. He's been trepidatious that he could pull this off, but I know he can. Anyway, Steve and I have been friends for about 30 years, but we lost track of each other for more than 15 years, and he just recently found me. Back in the day, Steve used to sell me Sun Microsystems workstations for the large corporation where I used to work. Now, if that isn't the oddest introduction, let's just go ahead and say, welcome to the show, Steve. Hi, Al. Thanks for having me. It's lovely to be here. And by the way, thank you for the disclaimer. I've listened to a few of your pods, and I know you have bona fide experts on your show, and I'm not one of those. I'm much more in the enthusiastic amateur category. So thank you. I made the big mistake of sending Steve the Andrea Gez interview, which is the number one thing I've ever done of the coolest person that I know. And so I probably should have started with some of my normal human level interviews. I was a little intimidated, it's true.
[38:04]You know, the story of how you found me after 15 years of having lost track was a little bit interesting, I think. How did that happen? I think you popped up as a Facebook friend recommendation. And what's interesting is, I don't think you and I have any friends in common on Facebook. So I guess the algorithms somehow, I don't know, through several degrees of separation, connected us. I think so. And went from there. I think that's crazy. And I hardly ever look at Facebook anyway. And I just saw a Facebook message. And boy, you send me a Facebook message, it could be three months until you'll hear from me. But I saw this one, it's like, Steve Mallard? What? So that's pretty fun. Anyway, Steve and Steve and I were out to lunch, and Steve Mallard started going on about a topic that I said he's passionate about, and it's the history of music tech starting way back with acoustic guitars. So where do you want to start on this, Steve? Well, you know, I was thinking I should probably also throw one of them into the mix, which is I...
[39:08]My background is technology, as you know, but I am also a musician of sorts. I grew up as a rock-obsessed teenager back in the UK, and this was in the era of those great 70s bands, the Led Zeppes and the Pink Floyds and the Black Sabbath and Deep Purple and those kind of bands. I know you have no interest in them, Alison, but maybe some of your listeners do. I actually went to a Pink Floyd concert when I was in college. Did you really? I'm impressed. I never have seen the Floyd live. So you have one over me. I saw the pig explode and everything in Anaheim Stadium. Yeah, I thought it was ridiculous and I was pretty bored. But anyway, I can do one. But despite having this rock obsession, I never learned to play an instrument as a kid. Because A, I was this geeky kid who spent way too much time studying for exams. And B, it just seemed way too hard. So I never got around to doing it. So a couple of years ago, I put that to rights by knuckling down and learning to play some instruments. So I started off playing guitar and struggled through that. I can play guitar now. I'd hate to say I'm any kind of master. I'm certainly no Jimmy Page or Jimi Hendrix, that's for sure.
[40:22]I say great across from there to play a little bit of bass guitar, tried some vocals. And most recently, I've been getting into drumming in quite a big way. So as I said, I am a musician of sorts, though I do stress the of sorts, because I'm really not very good at any of those things. So I'm not likely to be playing Madison Square Garden anytime soon or a theater near you. But you do play gigs with a little band, right? I do. I play in a couple of bands, but it's very low-key. So it's basically a bunch of old people playing at the M-Rock Stars, Alison. But it's good stuff. Gives me out of trouble.
[40:56]It keeps you in the bars. If you're interested in knowing which of those instruments I found the hardest, to my great surprise, it's drumming. And I started off coming at that from the approach of, well, how hard can it be? It's just banging stuff, right? But then you realize there's a heck of a lot more to it now. So I have a new appreciation of drummers who are typically the kind of laughingstock of most bands, but there's a lot more to it. Dave Hamilton of the Mac Geek Ab is a drummer, and he was just talking about it on the Mac Geek Ab about how you have to be looking at each of your limbs as separate things that you're talking to at all times, that your right hand and your left hand are doing different things in different rhythm with your feet. And it's a very mind thing to play the drums, which I'd never thought about before. And he also talked about now he sings sometimes when he's doing it. And he had to teach his brain that his voice was a fifth limb. So he had to put it in that category to get it to coordinate and be able to sing and play drums at the same time. I'd never thought about that.
[42:01]Well, firstly, he's absolutely correct. uh one of the first things you learn as a drummer is what they like to call limb independence i.e as you say you know your right hand doing one thing while your left foot's doing another and and all combinations of those in between and singing while you're drumming i i am so far away from that i can't even sing and play guitar and that that is the easiest thing of the whole lot um but anyway by the way i i will will say that uh as you get older uh like myself uh drumming is one of the very best things you can do to fight off age-related decline, both physically and mentally. It's a heck of a workout, and it's great fun. So there you go. Yeah, I've seen Dave after playing a gig, and it's like he's just drenched in sweat. That is a great workout.
[42:48]Dr. Mary Ann Gary has been on here, and she has talked about memory, and one of the things you can do is find something that's difficult for you to do and master it. And that's one of the things that can help with age-related dementia onset, or at least they've found correlations to that. They can't prove it's a causation yet. But one of the things she specifically described was learning to play the guitar. So I imagine the drums would be even harder. But let's turn into starting to talk about the tech here. Where did we start? Okay, so we should probably start the story in the early 20th century. Probably the 1920s is a good place to go. And I'm going to start off focusing around the guitar because most of the evolution in music technology is guitar-focused.
[43:36]So taking ourselves way back, of course, stringed instruments are nothing new. Stringed instruments have been around for hundreds, it's not thousands of years. So the thing which today we call a guitar evolved from mandolins and lutes and all those kind of things we had many, many centuries ago. And they all had one thing in common, which is you plucked a string. And it created a sound. Now, the first challenge is, how do you make that sound audible to an audience? And all of those traditional springed instruments had the concept of using resonance for amplification. So they had hollow bodies. And so the sound of the plucked string would resonate and amplify and produce some kind of volume. And that served us very, very well for hundreds and hundreds of years but when we got to and i'll say probably the 1920s is when this became an issue this was a year of jazz bands and swing bands and all of those kind of things and all of a sudden the poor guitarist is now competing with instruments like horn sections and drums and he gets drowned out so the the challenge which then had oh was she indeed so um the The challenge which had to be addressed was really one of volume.
[45:00]So amplification and things like microphones existed back in the 20s, but we had to find a way to apply those to the guitar in order to raise their volume to a level. Did they start by just shoving a microphone in front of the guitar? That was the initial approach. But you have to remember, back in the 1920s, microphones were pretty damn big things. So you couldn't just shove it in a guitar. Or in front of i was thinking of certainly put it in front of a guitar the problem there is that guitarists we tend to move around a lot and as the guitar moves around and moves in and out away from the microphone it can affect the volume you're getting so you get inconsistent kind of volumes so i think you know where i'm going next which is that uh one of the first, important um factors in the evolution of music tech was the development of the the magnetic pickup, which was a little device which sits underneath the strings on a guitar and converts the vibration of those strings into a signal which you can then amplify. And that's how you get volume out of guitar.
[46:08]Is that like a voice coil actuator, essentially? Like a voice coil, like in a speaker? It probably is. You probably know better than I do. I don't have your physics background. But yeah, it's a magnetic coil. Yes, indeed. Yes. Okay, so it's got a coil, a wire, and a magnet. It's moving in and out, and that's creating an electrical signal from the... So like a reverse speaker, if you will. I guess so, yes. That's true. And then turn that into an electrical signal. Amplification is already a known quantity. So it's just feeding a signal into an amplification system. And then you could finally hear the guitarist. Okay. So this was back sort of late 20s, early 30s.
[46:47]And after that, the electric guitars as such started to evolve. One of the first things is that people realized, well, we don't need this hollow body anymore. So the purpose of the hollow body on a guitar, as I said, is resonance. It's a large echo chamber. but when you're getting the volume through the pickup you no longer need the hollow body and in fact the hollow body just caused issues it caused kind of feedback issues with with the pickups so we saw the development of what became termed solid body electric guitars so now fast forward to the late 40s early 50s and this is when you have this explosion of solid body electric guitars and the creation of guitar brands, which you will still find today. So one thing I find fascinating is the three biggest selling electric guitars today, here in the year 2025, are the Fender Telecaster, the Fender Stratocaster, and the Gibson Les Paul. Each of those three guitars was created in the early to mid-1950s and has barely changed in all that time. there has been no evolution in the in the guitar itself huh yeah.
[48:03]But the technology that it uses, that it goes into, is where things have really changed? Indeed. The evolution is really what happens after you pluck the string. The plucking of the string really hasn't changed in 60, 70 years plus. And there's not really much else you can do, right? You string six guitars and an guitar and you hit them with a pick. But what happens next is where life gets interesting.
[48:29]Okay, so is that where we're going to go into changes in amplifiers? Oh, no. This is where we get something even better than that, first of all. This is where we, when music, when guitars became electric, people then started figuring out, well, now I can do stuff to manipulate the sound. And probably, and by the way, you have to remember that the evolution of the solid body guitars also kick-started the thing which today we call rock music.
[48:58]So rock music as an art form didn't really exist before the electric guitar in the early 1950s. So along comes the Bill Haley's of this world and the Elvis Presley's of this world. And it's all based about electric guitars. These weren't acoustic guitar people. They were using electric guitars. And the more forward thinking of these new rock stars figured out, okay, let's play around with the sound. And the first thing they figured out was typically something called overdrive. They figured out you're plugging your guitar into an amp. If you push the amp beyond the limits of its normal operating margin, the sound breaks up. And it breaks up in a really delightful way, a really distorted kind of gnarly way, which gives a kind of edge to your music. And so overdriven distorted music was one of the first guitar effects in the late 50s early 1960s and that was just the start of things you know coming from an ear that uh doesn't well the brain that doesn't enjoy music but an ear that prefers acoustic guitar this description really resonates if you can take that pun so they're doing it on purpose they're overdriving it on purpose because it makes a crazy sound.
[50:13]Yeah, yeah, for sure. Yeah. And that really was the start of 60s rock and roll. But there were many other guitar effects which came on the back of that. So things like reverb already kind of pretty commonly understood. Reverb is essentially emulating the sound of the venue you're playing. So, for example, you may want to recreate the sound of playing in a cathedral or playing in a small club or playing in a studio, those kind of things. You've got effects coming around such as really crazy modulation effects like flanges and phases and weird kind of space, sci-fi kind of sounds coming out.
[50:54]Jimi Hendrix famously made use of the wah pedal, so an actual foot pedal, which lets you kind of modulate the sound in tune as your foot moves. To create crazy, crazy effects. And the 1960s was really the era of when guitar effects really came into the room. And what you started finding was that the big-name guitarists of the 60s, and you think of the Hendrixes and the Clapton's and the Jimmy Page's and the Pete Townsend's, they all started creating their signature sounds. And their signature sounds was really based on a combination of a certain kind of guitar, a certain kind of amplifier, possibly a certain kind of speaker system. But certainly a certain combination of guitar effects. And these signature sounds are really quite well-kept secrets. They were closely guarded secrets. So you couldn't just walk out and say, oh, I want to sound like Jimi Hendrix today. It's not that easy. We'll come on to how that is a lot easier here in the 21st century, but back then, this was the secret source of rock music. Back when you were talking about the crazy sounds that they were just really fooling around with it, making wild and crazy sounds I have an album, I only own 12 albums one of them is called In Sounds From Way Out and I've got to figure out a way to record a little clip of that for you because I think you'll appreciate how ridiculous it is from back then.
[52:18]Okay. So probably the next big evolution in amplification was the move from classic vacuum tubes to solid-state amps. So the amps of the 1950s and 1960s were based on what we hear in the UK call valves, what you in the US call tubes, those big glass monsters.
[52:44]And they make a very warm sound, but they have challenges in a music, particularly a touring environment. They're heavy. Tube amps are heavy. They're expensive. And they're very, very fragile. So this was before the invention of the transistor. It is indeed. It is indeed, yes. So given all these shortcomings of tube-based amps, throwing them in the back of a transit van to go to your next gig is just not a good idea. To shed a glass. It wouldn't survive the journey.
[53:21]So guitar roadies would be packing replacement tubes in their bag for these eventualities. But as you say, the transistor came along, revolutionized all kinds of electronics. But just like many other forms of technology, guitar amps went solid state. They went transistor based and resolved at a stroke all those issues of weight, of expense and fragility. Now, it has to be said that. Did they make any change to what you could do with it? Or was it simply the portability and reliability of the equipment? Not really the the changes come later on when we when we get in when everything goes digital so save that thought for now we'll get back to that but it is it is important to say that uh tube amps have not completely gone out the window you can go to your local guitar store now and still buy tube amps so you may ask well why given all the advantages of transition and really it's probably the analogy would be the kind of people who buy vinyl records versus CDs.
[54:34]So the purists will say the tube amps still have a certain warmth and character about them, which even the best transistor amps have never really been able to fully replicate. I think most of us these days with a solid state, we use transistor-based amps. But no, the purists may still prefer the old tube amps. I am double-checking Steve to see if he's making this up, and he is not. I just went to Guitar Center. You can buy a Fender Blues Junior lacquered tweed tube guitar combo amp for $800 today, and it looks like it's from the 1950s. And that, by the way, I think that I'm going to guess that amp was introduced in the late 50s. It's a classic. As you'll see when we get onto the world of modeling amps, These days, you can replicate an amp like that by way of a modern digital amp, but hold that thought again for now. But how fun is that? I love that that still exists. Indeed. All right.
[55:33]In parallel with all these things changing the world of live music, there's also a revolution going on about the same time in the recording studio. So a classic recording studios use tape okay so you know you you stick a microphone or you take a feed directly from your arm and you feed it onto tape and uh you'd have um things like eight track studio 16 track studios these were the number of individual tracks you could lay down on a piece of tape um and by definition it's limited and tape-based recording has got many many limitations. It's a very labor-intensive process. Editing is very tricky. You can't just go back and change small parts of things you may have recorded. You're limiting in terms of the number of tracks.
[56:28]But round about, I'm going to guess, the 80s saw the evolution and the introduction of the first door, the digital audio workstations. Now, this is something you'll be familiar with recording podcasts.
[56:44]Yep. So in the recording studio, it's a close sister to the kind of spoken word tools you would use. But a tool called Pro Tools emerged in the 1980s and pretty quickly became a de facto standard for this new style of digital recording. Non-destructive editing. It's very, very easy to just go back, make changes. Uh the something called comping evolved comping is essentially picking the best bits of recording session and and editing it all together into a single final recorded uh version um you're unlimited on tracks because everything's digital it's just hard drive space you're no longer limited eight or 16 tracks um and you get a lot of editing tools you can do things like pitch correction so for example if your singer is just slightly off key well there are now tools to to correct that um if your drummer is slightly off the beat uh you can correct that there's a wonderful technical quantization which will very nicely align everything the drummer does to the the perfect beat um and this just completely revolutionized the world of studio recording.
[58:03]Many many years ago Shai Yamini the performer was on the show and he talked about the tools that he uses in audio editing and the things that he could do with this with changing pitch and beat and all that and he swears that if I make a recording of me singing, he could make it sound good and I've never gotten the nerve up to send him a recording but I still have that in the back of my head that someday I'm going to actually do that.
[58:29]Do it, do it. And by the way, of course, the evolution after the studio-based tools like Pro Tools was into the laptop-based tools. So, you know, I know that you have a slant towards the Mac user base. So people are probably familiar with GarageBand or GarageBand, as I'm sure you call it in America.
[58:48]Um this was to use a a a a rather sudy term the democratization of recording so all of a sudden you didn't need access to an expensive recording studio to to go make a record you can record it in your bedroom and we'll get more into this later on because everything has changed but doors have had a fundamental impact on the creation and the distribution in fact so would you say that that uh daws are what created the garage bands so apps like garage band created garage bands well the the original garage bands were of course you know bands from the 60s who literally threw stuff together in their parents garage and started paying and recording there so i think the term predated the the apple product but anyway but i think it was probably a tip of they had an homage to that right an homage indeed indeed indeed but but this was really the the tip of the iceberg of of the analog to digital revolution because in addition to to recording tools um amps went digital and this is where things really do get do get exciting so let me stop you for a second there so when we talk about we talk a lot about analog to digital digital to analog 80 uh you know dax digital audio converters things like that but i haven't had anybody on the show who would explain to everybody listening what what do you mean by analog.
[1:00:17]Versus digital what does that really mean okay well please use physics now come on you remember i'm a computer science major not a physics major.
[1:00:31]Well let me give you an example um so, In the probably early 90s to mid-90s, the first digital amps started coming out, and these were amplifiers which we've said already, you know, they've already gone this evolution of tubes to solid state. Now we start putting CPUs into amps. Now we start putting digital signal processors, DSPs, into amps. Now, a DSP is an interesting technology. But DSP takes an analog signal, such as you plucking a task string, converts it into digital, does some processing, plays around with it, manipulates it in some way, and then converts it back to analog, i.e. a sound, which can come out of a speaker. So when it's digital, it's just bits. It's bits. It's completely bits and bytes. Yes. Not a continuous wave of something, but bits that it can manipulate. Okay, good. It's in a digital form. You can do all kinds of stuff with it. And this is where the fun stuff I alluded to earlier really comes into its own. So I've got my guitar amp here.
[1:01:43]It's packed with computational audio and DSPs. Would it be nice if that amp could replicate that Fender Twin Reverb you just saw in Guitar Center or any other? You pick your favorite Marshall amp or you pick your favorite amp from ever, and at the flick of a switch, that amp can replicate the sound you'd get from a classic tube amp. This is the promise, and in fact the reality, that these modeling amps started delivering in the mid-1990s. And not just that, you can also add in effects. So all the effects I mentioned earlier, which previously the incarnation was a physical pedal, you stomped on your foot, well, guess what? Now it's just a switch and a digital modeling amp. You probably have your iPhone app, and you can just create whatever selection of whatever amp you choose and whatever selection of effects you want to add before and after it.
[1:02:37]And there is your sound. And I mentioned earlier on about the secret sauce of these 60s and 70s rock musicians, the signature sound. Well, guess what? Now you can easily replicate those sounds with the touch of a button. And by the way, of course, like most areas, AI is invading this. I've got an app on my phone that goes in my app. And if I say, hey, I want to sound like Jimi Hendrix on Hey Joe, bam, guess what? But AI will pick up exactly the perfect combination of amp and effects pedals to give me that sound.
[1:03:13]Wow. Yep, indeed. This is a lot of fun. So if it's on your phone, how is that ending up turning into sound coming out to my ear? There's a little gap between an iPhone app and that. Is it telling you dial this on your guitar and dial these buttons in the interface to your amp? The iPhone app is simply programming the amp. So there's a term, which is the signal chain. The signal chain for a guitarist is the combination of devices that go between the guitar on one end and the speaker on the other. It's that combination of amplifiers and effects pedals. So when you get into the digital frame, you have a virtual signal chain. You can design this virtual signal chain on your iPhone, press a button, download it into your amp, which then triggers the DSPs appropriately and magically out comes the sound you want. So this is pretty crazy. We went from, what, acoustic guitars and then tube amps to this in 100 years? That's astonishing.
[1:04:26]Indeed. And by the way, one of the other really fun aspects of this is I talked earlier about this, the democratization of music. Now that we can use these digital modeling amps to create these virtual signal chains and emulate the great guitarists of the past, this has also given rise to a community of tone sharing. So if you create a fantastic sound and you'd like to share with others, well, guess what? You can pop it up into the cloud, into these sharing sites, and others can download your tones and benefit from your creativity and design work. So a tone would be like what? What would a tone be? A tone is this virtual signal chain, a combination of an amp and effects, and maybe possibly also a speaker stack to go with it. But the sound that comes out of a guitar is essentially a tone. Okay, okay, the whole kit and caboodle here, this entire change to the analog signal to something completely different.
[1:05:28]Okay. All right. Now, in your little notes you gave me, you said you've got a little miscellaneous category here because we're caught up to the present so far. Yeah, I've been conscious. I focused just on guitars today, which is an interesting, it's probably the most valuable angle to take. But of course, when things start going digital, then other instruments are affected as well. One of the first obvious ones is the whole world of synthesizers and sampling. So, for example, the earliest electronic keyboards appear in the late 60s, the 70s, things like the modes and the ARPs. And all of a sudden, you get the emergence of electronic music and a whole different feel. No longer are you limited by the sound of conventional instruments. Sampling comes along, lets you essentially create mini recordings of anything from spoken word to sound effects. and you can pop them into your music. And all of a sudden, you have the evolution of music forms like hip-hop and others, which rely heavily on sampling.
[1:06:36]You get things like drum machines. So, hey, I hate to say this as a drummer, but drummers are relatively easily replaced by technology. Essentially, all we do is sit there banging things, as I said, and creating a beat. And that's pretty easy to replicate with digital technology. So drum machines come around so uh but they didn't get replaced when drum machines showed up right i'm sorry say again they didn't get uh drummers didn't get replaced when uh electronic drums came around their drummers are still there not entirely but in many many forms of music you won't find a lot of live drummers anymore uh one of the other challenges of recording uh drums in particular is it's really expensive. So for example, let's contrast recording drums in a studio with recording, say, a guitar or a keyboard. With a guitar, you run a cable or possibly a mic directly from the amp into your mixing desk. If you want to record drums, you are possibly rigging up as many as 12 to 15 microphones all around the drum kit to capture it authentically. And I'll Think about that. Yeah, so I'll give you an example.
[1:07:51]I have a drum teacher who is a real purist. He only has acoustic drums, the traditional drums you would have seen on stage. And he eschews all kinds of electronic technology. Now, I've got electronic drums, e-drums at home, which have a number of benefits, primarily for my neighbors, because they're essentially silent. I can bang away to my heart's content, listen to things through my headphone, and it sounds authentic. Um all my neighbors can hear is is the the softest patter of stick on drum pad um but from a recording point of view electronic drums have huge advantages so i was chatting with my my my purist drum teacher the day he told me he uses 17 microphones to record himself when he's drumming well guess what i do i run a usb cable out the back i'm sorry i thought you meant automatic uh replication of drums not the ability to record uh an electronic drums i misunderstood what you meant i i kind of veered from one to the other but my my point is that given the traditional drum kits are so problematic it's just another reason why a lot of modern music creators will use a drum machine rather than doesn't that feel completely different to be tapping on a little rubber mat than actually hitting a drum, Yes, but you'll get into one of the themes I'm getting into later, which is that modern music is a little bit different.
[1:09:17]You don't even get a workout, probably. You probably don't even get a workout, do you? Probably not. You probably do not. No, indeed, indeed.
[1:09:26]By the way, just before we leave this, I just want to also just mention the effect of technology on vocals. So we talked already about autotune, pitch correction. Okay, so you have the out-of-key singer. You can just make tiny corrections and put him back into tune or her back into tune. But very quickly, pitch correction and auto-tune move from a technical tool to a creative tool. So I know you're not a music fan, but probably the landmark in this regard is a song which I'm sure many of your listeners will know by Cher, a song called Believe in the late 90s, which really took auto-tune to a whole new level and unleashed the floodgates of auto-tune creativity and crazy, crazy vocal effects. So these days, you can buy a $200 vocal processing unit, which sits between your microphone and the PA system. And you can do all kinds of crazy things with your vocals. You can add harmonies. You can double up vocals.
[1:10:34]You can do formant shifting. Now, formant shifting is essentially changing the tone of a vocal. So for example, a simple example, you could make a male voice sound female or vice versa. All kinds of crazy things you can do with vocals as well. Technology gets everywhere. I do know that song, by the way. Oh, good.
[1:10:56]I do you a huge disservice. I apologize. No, that's okay. It's a good assumption. I do remember somebody telling me the classic case of being able to auto-tune a voice was David Hasselhoff, apparently cannot sing at all, and yet he's got an album where it sounds like he can.
[1:11:15]I want to say that might have been something that Shai told us about, but that was like the classic example of somebody who can't carry a tune. Well, he's certainly a very popular recording star. For example, in Germany, I know he's very big, and I wouldn't be surprised if he's had some help in technology, for sure, indeed. By the way, I hate to break any of your recording secrets, Alison, but maybe you use vocal processing yourself. It's nothing to be a shame, dog. Yeah, a little bit. Not too much, but yeah, definitely a little bit of sweetening of the audio, as they say. So what about, you had a note about MIDI. Was there anything about that? I'm not even sure I understand what MIDI is. I know it's an app on my Mac. MIDI is essentially a protocol which lets you program musical instruments in its very simplest form.
[1:12:07]It's code. It's computer code. A good example would be, let's say, for example, you create a melody on a MIDI-enabled keyboard, but you'd like to see what that sounds like played by an oboe. Well, MIDI can just convert that into an oboe or one of a million different orchestral sounds. That's just an example of MIDI in action. So MIDI is a technology which is big in the world of keyboards, drums as well to an extent. So, for example, I can record myself playing drums, output my recording via MIDI into my DAW, and then see what my electronic drum kit would sound on a classic 1960s drum kit, or maybe an electronic Roland drum kit or something like that. So, it's another way of manipulating your sound using digital technology. I can sure see how the purists have a problem with this, and yet you can't deny that this allows a lot of creativity that you couldn't have otherwise.
[1:13:14]Indeed so. But this is a segue into, if I may, in the best Steve Jobs tradition, please allow me one more thing, Alison. Okay. May I make a bold statement? Yes. Okay. Modern music is rubbish. Now, I don't care to- Hey, old man. I don't kind of sound too much like my parents or or too much like grandpa simpson shouting.
[1:13:49]But music changed around the start of the 21st century for a few reasons which i'll mention in a second and to my mind there hasn't been a heck of a lot of good music in the last 25 years, and and and i'll tell you why i think that is i i think there is some reason to believe this is true um there's a couple of reasons before i do i actually want to quote a source and maybe refer your listeners to to a very popular youtuber called rick beato uh b-e-a-t-o rick beato is a musician and a music producer and a songwriter um and a music journalist um he's got a fantastic youtube channel and um this was a theme he took up just a couple of months ago and i found it absolutely echoing all of my own thoughts so i just want to share with you some of the things he said which i'm completely in tune with and that there's there's a couple of big reasons why modern music is rubbish the first thing is it's too easy to make so i talked earlier about the democratization of music recording via laptop-based doors versus the recording studio. That's great. Anybody can create a song. That's the good news. The bad news is anybody can create a song, and there's an awful lot of rubbish out there. And it's just way too easy.
[1:15:16]Around about the same time, the music companies stopped signing bands and focused more on solo artists because they were cheaper the whole economics of the music business got turned upside down uh all of them why would you spend money signing uh an expensive rock band and all the expense that goes to recording them when you can find some talented guy sitting in his bedroom with garage band recording the next big hit uh the economics just favor the the solo artists these days and bands i'm afraid have gone out of fashion i might be playing right into your hand here because you are calling out that it's better to more economical to have a solo artist, but you've basically just dissed every Swifty on the planet by saying that modern music is, is rubbish. I, I, I, and I was stunned by that. But they did. Billy Eilish. I mean, come on.
[1:16:14]They're talented artists, but they get a heck of a lot of help from technology. And by the way, I think you know exactly where this is going. But does that make it rubbish? But does that make it rubbish just because they got a lot of help from technology? To my mind, yes. It really does. It really does. We probably don't have enough time to explore why I feel Taylor Swift and Billie Eilish are bland in the extreme. I don't find them exciting at all. And by the way, they get an awful lot of help, those artists and others, from, and you guess where this is going, AI. AI songwriting tools are now very, very common. And yes, the Taylor Swifts and the Billy Eilichs of this world make extensive use of AI songwriting tools. So what? That doesn't make the music rubbish. Oh, no, no. That is a brave position to take, Alison. I have to tell you. No, I'm just challenging the supposition that because something was generated with AI, it is therefore rubbish. Because it's a solo artist, it's therefore rubbish. Because they had modern technology, it's therefore rubbish. That's not a direct line. There's not a logical line between those two. It might be rubbish, but drawing that direct causation, I don't know about that. Well, okay. So you just pull the rug out beneath every music creator in the world then, because if they can be replaced by machine, what is the point of making your own music?
[1:17:41]I didn't say that. I didn't say they can be replaced by machine. I mean, to me, you could say the exact same. Everything you said could be radio recording people to podcasting, right? Because it's been democratized and any joker in her bedroom, whatever, can create a podcast. So there's a lot to choose from, but there are still great podcasts. It doesn't mean that it's rubbish because I've got the technological tools to allow me to do this and I'm not in a fancy recording studio. And it sounds like you're saying the same thing about music.
[1:18:16]But you are coming up with the original content for your pods. Now, songwriting is different. So let's break this down to the basic musical theory. Music is 12 notes. Every song is only different from another song, with those 12 notes being arranged in a different fashion. Except for Indian music. Well, okay. I'm talking about Western music. Yes, there are different forms of world music. Yes, you are correct. They have lots more notes. We're in the world of Taylor Swift and Billie Eilish here. So we work in the world of 12 notes and chromatic scales. So this is a perfect topic for AI. So all the artists we talked about are making extensive use of AI. Now, right now, AI is not writing hit records exclusively, but all of these big songwriters will use AI tools. I think the term is ideation. So they'll create an AI-based song, and they'll think, uh ah i like that chord progression over there i'll use that or i liked that lyric over there and they steal bits so the the top songwriting teams of today are ai assisted they are hybrid ai human creations now what's interesting is what happens when ai actually creates a hit record from scratch.
[1:19:37]Now we get into a legal minefield apartment in the house because who gets paid? Stay off of that train because that's a whole different topic. And I'm really going to challenge it here. What is to say that that AI-created song, if it's a hit record, is rubbish? That's all I'm getting to is you may find it offensive that it's not created by a human, but what makes it rubbish? Rubbish is in the eye of the beholder. Yeah, I think that you just like classic rock and that you don't like modern music. I don't think it has anything to do with how it's created. Okay, well, let's take a step backwards and think about this. How do AI tools create music? Okay, so they have a large language model, right? Where has that large language model come from? It's been populated by decades of recorded music. Not populated. AI music is derived from decades of previously recorded music. Correct. Trained on. It's not original.
[1:20:39]So no coding is original because we're all using GitHub Copilot to write our code. Okay. Yeah. All right. Well, we can probably agree to disagree on that. But let me just, I just want to give you a statement, which I found very interesting. So I just want to read this paragraph to you. Creative AI tools can be seen as sophisticated plagiarism software as they do not produce genuine original content, but rather emulate and modify existing works by artists subtly enough to circumvent copyright laws. You know what's interesting about that? That was written by ChatGPT. Sure.
[1:21:20]I just, I don't know. I'm thinking about the, who was it that Ed Sheeran had the, somebody was saying that he stole their song because it sounded a lot like something they'd written. I believe artists have been stealing each other's ideas.
[1:21:37]Painters do this. Everybody is in a certain genre at the same time. Everybody paints like so-and-so at a certain era in time. Same thing happens with music. um i think that's i think that's a natural progression but but i'm just going to keep contending that that doesn't make what you hear rubbish because rubbish is in the ear of the beholder people okay okay feel good listening to taylor swift right by the way ed sheeran was taken to court and there's many other examples of that beforehand but uh yes uh he won in that one the the interesting thing is going to be uh copyright law has to evolve in this area because Because as and when AI does, from scratch, create a billion-selling hit record, who makes the money? Is it the guy who created the song? Is it the company that created the AI tools? Or is it the music companies whose back catalogs were used to train the language models?
[1:22:33]It's a legal minefield. I understand it's a legal minefield. I will never argue with you on that point. That is well beyond the scope of this. But I'm going to throw one more way to argue with you about your saying this is rubbish. I'm going to pull back the curtain a little bit. When Steve and I were talking about having this conversation, he had already had this conversation with me, so I knew he was capable of reproducing it. But he wanted a good outline of what he was going to talk about. So what did he do? He went to ChatGPT and had to create an outline for him. And then he picked and chose what pieces of that he wanted to work with, exactly like you just said Taylor Swift or the other artists are doing with their music.
[1:23:11]I'm a big fan of AI, just not for creating music.
[1:23:15]By the way, just to finish on the theme of modern music is rubbish, I said there was two reasons. The first is it's too easy to make. The second is it's too easy to consume. So why in the decades back? I know when I was a teenager, I had my paper round. I would save my money up to buy that album I wanted. And I would go home and I'd stick it in my record player in my bedroom and listen to it over and over and over again. And I paid, I don't know, five, six pounds an album. Now we're probably talking three times that. But of course, music consumption has completely evolved. We went from, well, all the way back from vinyl to 8-tracks to cassettes to CDs to digital downloads. And now, of course, we're in the world of streaming and Spotify. And not only is it very easy to create music, it's very easy to get your music distributed. So I can sit in my bedroom with Gary's band and create my song, and I can upload it to Spotify.
[1:24:23]Um a heck of a lot of stuff gets uploaded to spotify there's not a lot of quality control on the stuff in spotify these days so um music has essentially become valueless and it's it's to my mind it's it's it's hard to find the good stuff uh for ten dollars a month i think we've kind of lost a little bit of um judgment shall we say it's it's a it's a tricky area i can see what you're saying it's it's um it's just become diluted maybe is that the word i'll give give you an example by the way uh last year some friends of mine they have a band um they wrote and recorded an entire album's worth of songs 11 original songs and they they uploaded them all to spotify and apple music and just before christmas they had their 10 000th stream And by the way, a stream in Spotify, Apple Music Terms, is when it's played for at least 30 seconds.
[1:25:25]So 10,000 people thought enough of their music to play it. Now, okay, that's not your Taylor Swift. Or 10,000 plays might be 5,000 people twice. It could possibly be that indeed, or any combination of the above. Now, that's not Taylor Swift. Uh, but I tell you, if I wrote something that 10,000 people listen to, I'd feel pretty happy about it. But do you know what their commission check, their royalty check was from Spotify for 10,000 plays?
[1:25:56]$40 between four of them. Wow. So the whole economics of music is upside down. Well, I've heard that artists in general have always made more money touring than they make from their albums. That's always been true, right? It's certainly true today. It may or may not have been true back in the heydays of the 60s, 70s, 80s, but it's certainly true today. The money's in tour and the merch, the merchandise. There you go. All right. Well, I want to buy a t-shirt for your band. That's what I want.
[1:26:34]Well we have a name which we'll be pleased to know was created by ai um called the titanium saints the titanium saints yeah i was hoping you were going to say mouse rat but uh that's a whole other story all right steve um i didn't ask you this ahead of time do you have any sort of social media presence where people could uh write to you and tell you that you're brilliant and that everything i said was all wet uh not really i guess i'm traditional but you can always email me at steve at mallard.com, which is easy enough to remember. All right. Well, I'm sure glad we reconnected and I had a lot of fun doing this with you, Steve. It was a blast. Thank you for inviting me, Alison.
[1:27:11]Well, now I can say I actually did a chit-chat across the pond all about music. Bet you never saw that one coming. I had a blast talking to Steve. That was really fun. But that is going to wind us up for this week. Did you know you can email me at alison at podfeet.com anytime you like? If you have a question or suggestion, just send it on over. Remember, everything good starts with podfeet.com. You can follow me on Mastodon, podfeet.com slash Mastodon. If you want to listen to the podcast on YouTube, you can go to podfeet.com slash YouTube. If you want to join the conversation, you can join our Slack community at podfeet.com slash Slack. You can talk to me and all of the other lovely Nocella castaways there. You can support the show at podfeet.com slash Patreon, like Zoe Brings Bacon did this week, or with a one-time donation at podfeet.com slash donate. And there you can use Apple Pay or any credit card, or you can use PayPal at podfee.com slash PayPal. And if you want to join in the fun of the live show, head on over to podfee.com slash live on Sunday nights at 5 p.m. Pacific time. Enjoy the friendly and enthusiastic.
[1:28:09]Music.