Back to Basics: Practical Antennas and the Beauty of Simplicity
When I first started in electronics back in the mid-’80s, one of the first things I learned was that good design doesn’t always mean complex design. Nowhere is that truer than with antennas. Chapter 8B of the RAC Basic Certificate Course reminded me of something many of us tend to forget: the simplest antennas often work the best.
Riding the Waves: Reflections on Radio Propagation
This week’s class on radio-wave propagation took me right back to my roots. It’s hard not to smile when I see diagrams of skywaves, ground waves, and skip zones — things I first learned about nearly forty years ago at Sheridan College, when we were still using analog oscilloscopes and scientific calculators. Back then, I was an electronics engineering technology student just getting ready to start my career, and the idea that invisible waves could circle the globe felt almost magical.
Now, as I sit in class once again — decades later, after a long career in electronics and electrical control system design — I see those same principles through a different lens. What once seemed like mysterious phenomena now feel like old friends whose habits I know well. The theory hasn’t changed, even though the tools and applications have evolved beyond what I could have imagined in 1985.
From Spark-Gap to Skywave
The story of radio propagation is as much about curiosity as it is about science. When Guglielmo Marconi proved in 1901 that a wireless signal could cross the Atlantic, he didn’t know why it worked — only that it did. He thought the waves were following the Earth’s curve, but in reality, they were bouncing off the ionosphere — a discovery that wouldn’t be fully understood for decades.
That experiment, conducted from Signal Hill in St. John’s, Newfoundland, still captures my imagination. It’s a reminder of how experimentation often leads the theory, and how the physics that govern our communication systems today are the same ones that guided those first transatlantic dots and dashes of Morse code.
The Ionosphere: A Living Mirror
Of all the layers of the atmosphere, the ionosphere remains the most fascinating. The D, E, and F layers — invisible, dynamic, and ever-changing — are what make global communication possible.
The ionosphere lies largely within the thermosphere, a region of Earth's atmosphere characterized by very high temperatures, ranging from about 550 to over 1300 K, driven by solar EUV radiation. [1]
Ionospheric regions illustration; image: qsl.net [1]
During the day, the D layer absorbs low-frequency signals, muting AM stations that roar back to life at sunset. The higher F layers, split into F1 and F2 in daylight, refract HF signals over thousands of kilometres, bending them back toward Earth like a mirror made of charged particles. [1], [3]
Even after years of working with EMI and high-frequency noise in control systems, it still amazes me how the same physics in a factory floor’s PLC cabinet also govern how a signal from halfway around the world reaches a simple dipole antenna in my backyard.
Of Sunspots, Solar Flux, and the Unseen Weather Above
As engineers, we tend to think of “weather” as something that affects reliability through temperature or humidity. But this week’s class reminded me that space weather plays an equally critical role. Solar flares, coronal mass ejections, and sunspots — the restless activity of our nearest star — shape the ionosphere daily [2], [5].
The Dominion Radio Astrophysical Observatory in Penticton, BC, has been measuring the Sun’s radio emissions at 2800 MHz since the 1940s, producing the Solar Flux Index we still rely on. Seeing that connection between a Canadian observatory, solar physics, and real-world radio performance renewed my appreciation for how deeply our communications depend on phenomena far beyond our control.
Skip Zones, Fading, and the Fragile Nature of Connection
In my work over the years, I’ve seen signal dropouts caused by everything from ground loops to induction noise, but ionospheric fading has a kind of poetry to it. The way signals strengthen and vanish as the ionosphere shifts with time and sunlight reminds me that all communication — whether through copper, fibre, or free space — is ultimately about timing and conditions.
Overview of HF Propagation Modes; image: qsl.net [3]
Skip zones, backscatter, and multipath effects are more than textbook curiosities; they’re metaphors for the real-world challenge of designing systems that perform reliably despite an unpredictable environment. Our ability to predict, compensate for, and even exploit those effects is a testament to a century of accumulated engineering wisdom.
NVIS and the Modern Relevance of Propagation
Near Vertical Incidence Skywave (NVIS) propagation especially caught my attention. It’s used today for short-range HF communication, where terrain or disasters might block line-of-sight signals. I couldn’t help but think how that same principle could apply to robust emergency communications or even industrial networks in remote regions.
NVIS signals are propagated at a high elevation (greater than 60°). They are reflected down from the ionosphere (principally the F2 layer) over an area of a few hundred kilometres from the source. [4]
Using trigonometric calculations, we can estimate the NVIS coverage to be about 350 kilometres. Because the diagram represents a two-dimensional view, it’s essential to understand that this range extends equally in all directions from the transmitting station. In other words, if we imagine a circle centred on the transmitter with a radius of 350 kilometres, any receiver positioned within that circle should be able to detect the NVIS signal.
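The trigonometry behind that estimate can be sketched in a few lines. This is a rough single-hop model with assumed values (an F2 reflection height of about 300 km and a minimum elevation angle of 60°); real coverage depends on ionospheric conditions.

```python
import math

# Assumed values: F2-layer reflection height and minimum NVIS elevation angle.
f2_height_km = 300.0
elevation_deg = 60.0

# For one hop, the signal goes up at the elevation angle, reflects, and comes
# back down, so the ground range is 2 * h / tan(elevation).
ground_range_km = 2 * f2_height_km / math.tan(math.radians(elevation_deg))
print(f"Estimated NVIS coverage radius: {ground_range_km:.0f} km")  # ~346 km
```

That lands right around the 350 km figure quoted above, which is why NVIS is often described as covering "a few hundred kilometres" around the transmitter.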
Despite all our modern connectivity, there’s still a place for understanding how nature itself moves information from one point to another.
Closing Thoughts
For me, studying propagation again isn’t about relearning the formulas — it’s about rediscovering the elegance behind them. The same electromagnetic principles that may have carried Marconi’s letter “S” across the Atlantic in 1901 still underpin the wireless links in a factory, the telemetry in a wind farm, or the Wi-Fi in our homes.
After four decades in the field, it’s humbling to be reminded that no matter how advanced our technology becomes, we’re still riding the same waves — reflections from the same sky, governed by the same physics, and inspired by the same human drive to connect across distance.
References
[1] D. Tal, "Region vs. Layer: Earth's Atmosphere and Ionosphere". Accessed: Oct. 10, 2025. [Online]. Available: https://www.qsl.net/4x4xm/Region-vs-Layer-Earth’s-Atmosphere-and-Ionosphere.htm
[4] John VA3KOT, "The NVIS Illusion", Ham Radio Outside the Box. Accessed: Oct. 21, 2025. [Online]. Available: https://hamradiooutsidethebox.ca/2023/06/19/the-nvis-illusion/
[5] Natural Resources Canada, Government of Canada, "Current regional magnetic conditions". Accessed: Oct. 21, 2025. [Online]. Available: https://www.spaceweather.gc.ca/forecast-prevision/short-court/regional/sr-1-en.php?region=ott&mapname=east_n_america
From Sparks to Software: Reflections on Waves, Wavelengths, and the Spirit of Radio
Every so often, a class reminds me that the fundamentals never really change. This week’s topic — Waves, Wavelength, Frequency, and Bands — was a trip back in time for me. As someone who trained as an electronics engineering technologist in the 1980s, this material feels like old territory. Yet, revisiting it in today’s context, with modern software-defined radios (SDRs) and digital signal processing, highlights just how enduring these basic principles are.
The Timeless Language of Waves
The lecture opened with the essentials: amplitude, wavelength, and frequency — the vocabulary of every waveform, whether it’s a sound wave in air or a radio wave racing through space. Back then, we learned these on oscilloscopes with green phosphor traces, adjusting time bases and triggering circuits to freeze those elegant sine waves in time. Today, the same principles remain, but the scopes are digital, and the waves are simulated, captured, and analyzed by software.
What hasn’t changed is the relationship between wavelength and frequency — the beautiful simplicity of c = fλ, where the speed of light ties them together. It’s still the same 300 × 10⁶ metres per second in free space, as it was for Heinrich Hertz when he first proved Maxwell’s equations right. The numbers might scale from kilohertz to gigahertz now, but the physics hasn’t changed a bit.
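The c = fλ relationship is simple enough to capture in a couple of lines; the band examples below are just illustrations:

```python
C = 3.0e8  # speed of light in free space, m/s

def wavelength_m(freq_hz: float) -> float:
    """Wavelength from frequency: lambda = c / f."""
    return C / freq_hz

# The 40 m amateur band sits near 7 MHz:
print(wavelength_m(7.0e6))   # ~42.9 m -- hence the band's name
# Wi-Fi at 2.4 GHz:
print(wavelength_m(2.4e9))   # 0.125 m
```

The same formula covers Marconi's 70 kHz longwave signals (over 4 km per wavelength) and a gigahertz Wi-Fi link; only the scale changes.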
A Short History of Long Waves
Thinking about waves and bands always takes me back to the early history of radio — to the days of spark-gap transmitters, longwave Morse signals, and men like Hertz, Marconi, and Fessenden. Those pioneers were dealing with kilohertz frequencies, with antennas and capacitors the size of ships, yet they were manipulating the same invisible medium we use today.
Marconi’s Antenna Towers, Glace Bay, Nova Scotia. The four 200-foot wooden latticework towers shown supported an inverted pyramid of antenna wires not visible in the photo.
Marconi Antenna Array Structure, 1907
I’m pretty sure my neighbours would be unhappy with me if I built an antenna array that resembled Marconi’s Glace Bay installation!
The circuit diagram of the December 1901 Poldhu transmitter in J.A. Fleming’s handwriting.
By 1907, the Marconi station at Glace Bay, Nova Scotia, was operating on remarkably low frequencies for its time — around 70 kHz. That range was chosen for the transatlantic service linking Glace Bay with Clifden, Ireland, inaugurated in October 1907.
Earlier experiments at the site had been conducted on higher frequencies — roughly 182 kHz in 1902 and 272 kHz for some later trials — but Marconi’s engineers soon discovered that lower frequencies provided far greater reliability over long distances. By 1907, they had moved down to around 70 kHz, and eventually as low as 45 kHz.
These very long wavelengths (VLF) proved ideal for transatlantic work. The signals could travel enormous distances, reflecting between the Earth and the ionosphere, and remained stable both day and night — something that higher frequencies of the day simply couldn’t achieve. Considering the limits of early transmitters and receivers, this was an extraordinary technical accomplishment, and a milestone in the development of reliable global communication.
First Transatlantic Radio Service, October 17, 1907
In the photo above, Marconi Operator L.R. Johnstone is shown transmitting the first official messages of the commercial wireless telegraph service from Marconi Towers, near Glace Bay, Nova Scotia, to Clifden, Ireland, on October 17, 1907. Ten thousand words were exchanged between the stations on the first day of operation.
The rotary spark discharger was the heart of the spark transmitter at Marconi Towers. In the upper-right background, the three large ring-shaped loops are the three turns of the antenna transformer’s primary coil, which fed the spark’s electrical energy into the high-mounted antenna.
Historical Photo of Marconi Wireless Site and Towers in Glace Bay, Nova Scotia, Canada. See page for author, Public domain, via Wikimedia Commons.
Coal-fired boilers in the power house at Marconi Towers produced steam for the main steam engine and alternator; image: Cape Breton Wireless Heritage Society
At the Marconi station in Glace Bay, Nova Scotia, six steam boilers powered dynamos that generated the 15 kV power supply, which charged a capacitor composed of 288 metal sheets, each measuring 60 feet by 20 feet, separated by approximately 6 inches. The sheets were suspended from rafters at the top of the building and hung vertically almost to ground level. This capacitor (or “condenser,” as it was originally called) occupied most of the 160-foot-long transmitter building. As a result, the building became known as the condenser building.
Main operating building of Marconi Wireless Station, ca. 1912. Collection of Port Morien Station, ca. 1912.
This extensive array of plates offered only 1.7 microfarads of capacitance, with a voltage rating of 15 kilovolts. An “air-insulated” design was selected over a more compact glass dielectric design because it was relatively trouble-free and easy to build using locally available materials. If a draft caused the plates to shift and short-circuit, the “spot-welded” plates could be knocked apart with a sledgehammer.
Marconi’s Condenser (Capacitor) at Glace Bay, Nova Scotia
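As a sanity check, the parallel-plate formula C = ε₀A/d can be applied to the dimensions given above (288 sheets of 60 ft × 20 ft, spaced about 6 inches apart, air dielectric). This idealized estimate ignores edge effects and plate sag, so it should only land in the right ballpark:

```python
EPSILON_0 = 8.854e-12  # permittivity of free space, F/m
FT = 0.3048            # metres per foot
IN = 0.0254            # metres per inch

plate_area_m2 = (60 * FT) * (20 * FT)  # each sheet: 60 ft x 20 ft
gap_m = 6 * IN                         # ~6 inch air gap between sheets
n_gaps = 288 - 1                       # 288 sheets form 287 gaps in parallel

c_per_gap_f = EPSILON_0 * plate_area_m2 / gap_m
c_total_uF = n_gaps * c_per_gap_f * 1e6
print(f"Estimated capacitance: {c_total_uF:.2f} uF")  # ~1.86 uF
```

That comes out close to the 1.7 µF figure quoted for the real condenser — not bad agreement for a building-sized capacitor modelled with a one-line formula.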
One story that sticks in my mind is Weather Station Kurt, a German automatic transmitter secretly placed in Labrador during the Second World War. It operated on 3940 kHz — smack in what we now think of as the amateur 80 m band — sending weather reports back to Germany every few hours. Even in wartime, radio’s reach was both strategic and scientific, and its physics — the propagation of electromagnetic waves through atmosphere and ionosphere — governed everything.
From Resonant Circuits to Software Radios
Interaction between Capacitive and Inductive reactance and the resonance point — Slide by Al Penny VO1NO
When I first learned about LC circuits, we tuned them with physical coils and capacitors, feeling resonance through the sound in a speaker or the flicker on a meter. The “bands” we talked about then — HF, VHF, UHF — were as much about practical limitations as they were about regulation. Antenna length, wavelength, and atmospheric reflection all dictated what was possible.
Today, when I look at a modern software-defined radio (SDR), I find it astonishing to think that an entire superheterodyne receiver now fits in a USB drive. Instead of a coil and a capacitor, we use algorithms to shift and filter signals in code. But the foundation — waves, frequencies, and harmonics — is the same. Whether it’s a spark-gap transmitter, a vacuum-tube superhet, or a LimeSDR, every one of them lives by the same electromagnetic laws.
Why It Still Matters
There’s a certain poetry in knowing that a 7 MHz signal — the classic 40 m amateur band — still bounces off the ionosphere just as it did when I first listened to shortwave broadcasts as a teenager. The gear has changed; the spectrum hasn’t. Even in a world dominated by Wi-Fi, Bluetooth, and cellular links operating in gigahertz ranges, it all still comes down to waves oscillating in time and space.
So while the students around me might be seeing this material for the first time, I find myself smiling at how familiar it feels. The math is old, the physics immutable, and yet the applications keep evolving. From sparks to software, from the hiss of AM static to the crisp digital decoding of an SDR waterfall, it’s all the same story — one wave at a time.
Looking Ahead
As the course continues, I’m looking forward to diving deeper into how these timeless wave fundamentals shape the communication technologies we rely on every day. Understanding them through both the lens of experience and modern tools like SDRs bridges the gap between the analog world in which I first learned and the digital systems that define today’s engineering practice. It’s a reminder that while technology evolves, the language of physics remains beautifully consistent — and that makes every new concept feel like an old friend revisited.
From Capacitance to Resonance: Revisiting the Fundamentals
Over the past couple of classes, we’ve been exploring two interconnected concepts in electronics: capacitance and resonance. For me, these aren’t new ideas—I first studied them at Sheridan College back in the 1980s, when oscilloscopes still had CRTs and breadboards came with point-to-point wiring. Even so, it has been rewarding to revisit these foundational topics with a fresh perspective and a few more decades of practical experience under my belt.
Understanding Capacitance
Capacitance is one of those elegant concepts that forms the backbone of modern electronics. A capacitor—two conductive plates separated by an insulating dielectric—resists changes in voltage by storing and releasing energy. It’s such a simple structure, but its impact on everything from timing circuits to power supplies can’t be overstated.
Revisiting the theory reminded me just how differently capacitors behave in DC versus AC circuits. In DC, a capacitor charges up to the supply voltage and then effectively becomes an open circuit. In AC, though, the constantly changing voltage makes it look as though current flows straight through—even though electrons never actually cross the dielectric. This behaviour gives rise to capacitive reactance (XC), which decreases as frequency increases. That’s why capacitors block DC but pass high-frequency signals, making them indispensable in filters and coupling circuits.
The unit of capacitance, the farad, is far too large for most real-world applications, which is why we use microfarads, nanofarads, and picofarads instead. Plate area, spacing, and dielectric material all shape a capacitor’s behaviour—details I once learned in a classroom, but now appreciate in a far deeper way after years of working with safety systems and control circuits.
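The frequency dependence of capacitive reactance described above follows directly from XC = 1/(2πfC). A quick sketch, using an arbitrary 100 nF capacitor as the example:

```python
import math

def capacitive_reactance(freq_hz: float, cap_f: float) -> float:
    """Capacitive reactance: XC = 1 / (2 * pi * f * C)."""
    return 1.0 / (2 * math.pi * freq_hz * cap_f)

# A 100 nF capacitor:
print(capacitive_reactance(60, 100e-9))    # ~26.5 kOhm at mains frequency
print(capacitive_reactance(1e6, 100e-9))   # ~1.6 Ohm at 1 MHz
```

The same part that nearly blocks a 60 Hz signal is almost a short circuit at radio frequencies, which is exactly why capacitors turn up in coupling and filtering roles.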
Revisiting Resonance
Next came resonance—another familiar but fascinating topic. An inductor resists changes in current through its magnetic field, while a capacitor resists changes in voltage through its electric field. Put them together and, under the right conditions, they exchange energy back and forth in a kind of electrical echo.
Resonance happens when the inductive reactance (XL) and capacitive reactance (XC) are equal and cancel each other out. At that frequency, the circuit oscillates like a perfectly tuned pendulum, trading energy between the capacitor’s electric field and the inductor’s magnetic field. That’s the essence of every tuned circuit—from early radio receivers to modern communication filters.
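Setting XL equal to XC and solving gives the familiar resonant frequency f₀ = 1/(2π√(LC)). A minimal sketch, with example component values chosen to land in the HF range:

```python
import math

def resonant_frequency(l_h: float, c_f: float) -> float:
    """LC resonance: f0 = 1 / (2 * pi * sqrt(L * C))."""
    return 1.0 / (2 * math.pi * math.sqrt(l_h * c_f))

# 10 uH with 100 pF resonates in the HF range:
print(f"{resonant_frequency(10e-6, 100e-12) / 1e6:.2f} MHz")  # ~5.03 MHz
```

Halve the capacitance and the resonant frequency rises by √2 — the relationship every variable-capacitor tuning dial exploits.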
One example that always comes to mind when talking about resonance is the Tacoma Narrows Bridge collapse of 1940. Although it wasn’t pure electrical resonance—it was aeroelastic flutter—the underlying principle was the same: energy reinforcing itself until a system fails. It’s a dramatic reminder of how resonance, in any form, can be both powerful and destructive if it’s not managed properly.
Series and Parallel Resonance
Reviewing series and parallel RLC circuits brought back memories of long lab sessions and breadboards filled with coils and capacitors. In a series circuit, resonance minimizes impedance, leaving only the resistance to limit current. In a parallel circuit, it does the opposite—impedance rises to a maximum.
Those two behaviours form the basis of almost every practical filter: low-pass, high-pass, band-pass, or notch. Seeing the relationships between theory and application again reminded me why I fell in love with electronics in the first place. There’s something deeply satisfying about watching a sine wave sharpen or flatten on a scope exactly as the equations predict.
The Quality Factor (Q) and Real-World Radios
We also revisited the Q factor, which describes how “sharp” or selective a resonant circuit is. High-Q circuits have narrow bandwidth and greater selectivity, while low-Q circuits are broader and less discriminating.
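One common way to express selectivity is Q = f₀ / bandwidth, where the bandwidth is measured between the −3 dB points. A sketch with illustrative numbers:

```python
def q_factor(f_center_hz: float, bandwidth_hz: float) -> float:
    """Selectivity of a tuned circuit: Q = f0 / (-3 dB bandwidth)."""
    return f_center_hz / bandwidth_hz

# A 7 MHz tuned circuit with a 70 kHz -3 dB bandwidth:
print(q_factor(7e6, 70e3))  # Q = 100 -- quite selective
# The same centre frequency with a 1 MHz bandwidth:
print(q_factor(7e6, 1e6))   # Q = 7 -- broad, crystal-set territory
```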
I built my first crystal radio as a preteen, so it was fun to see it come up again. With its single tuned circuit and diode detector, the crystal set has a very low Q—but it works. With nothing more than a coil of wire, a bit of crystal (or even a sugar cube and a sewing needle, if you’re improvising), and a pair of headphones, you can literally pull voices and music out of the air. Even now, I still find that magical.
My Takeaway
Revisiting capacitance and resonance after four decades reminded me how enduring the fundamentals really are. These aren’t just abstract properties—they’re the building blocks of everything from radios to robotics. Capacitors don’t merely “store charge”; they make stable, responsive, and selective systems possible. Resonant circuits don’t just cancel reactances; they allow us to shape and control the signals that carry our modern world.
Coming back to these topics, I find myself both nostalgic and appreciative. The equations haven’t changed, but my understanding of their importance has deepened. It’s a reminder that no matter how advanced technology becomes, it all still rests on the same elegant principles we learned with coils, capacitors, and curiosity.
Sunday the 28th was our fourth class. We covered the basics of magnetism and inductance.
Inductance: From High School Chalkboards to Everyday Engineering
When I first studied inductance back in high school in the 1980s, it all felt a bit abstract. We were told that electricity and magnetism were linked in curious ways, that a simple wire could resist changes in current, and that this mysterious thing called “back EMF” always seemed to push back against whatever we tried to do. At the time, it felt like a trick of the math more than a tangible reality.
Fast forward to today, and I see those same fundamentals in every transformer, motor, and circuit I encounter. The equations on the chalkboard have become the backbone of modern engineering practice.
The Basics Haven’t Changed
Inductance, at its core, is about resistance to change. Any conductor carrying current produces a magnetic field, and when that current changes, the magnetic field changes with it. Faraday showed us that a changing magnetic field induces a voltage, and Lenz’s law tells us that voltage will always oppose the change that created it. That’s why a conductor, whether straight or wound into a coil, pushes back against changes in current flow.
In a classroom, this was demonstrated with the right-hand rule and magnetic field lines circling a wire. Winding that wire into a coil concentrated the flux, boosting its ability to oppose current changes. Back then it was a curious property. Today, I see it as nature’s built-in safety feature.
Inductors in Action
This week’s presentation walked through inductance in both DC and AC circuits. With DC, inductance delays the rise of current until the magnetic field stabilizes. With AC, the story is more dynamic: the magnetic field is always changing, always inducing a voltage that resists the flow. The result is what we call inductive reactance—an opposition that grows with frequency.
This dual behaviour explains why inductors pass DC easily but can choke out higher-frequency signals. It’s the same principle that makes them indispensable in filters, power supplies, and countless control applications.
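Inductive reactance grows linearly with frequency, XL = 2πfL — the mirror image of the capacitor's behaviour. A short sketch with an example 1 mH choke:

```python
import math

def inductive_reactance(freq_hz: float, l_h: float) -> float:
    """Inductive reactance: XL = 2 * pi * f * L."""
    return 2 * math.pi * freq_hz * l_h

# A 1 mH choke:
print(inductive_reactance(60, 1e-3))   # ~0.38 Ohm -- mains passes almost freely
print(inductive_reactance(1e6, 1e-3))  # ~6.3 kOhm -- RF is choked off
```

Side by side with the capacitive-reactance formula, this is why an inductor and capacitor trade roles in a filter: one passes what the other blocks.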
From Inductors to Transformers
Once coils are involved, it’s a short step to transformers. Here, changing currents in one coil induce voltages in another, enabling us to step voltages up or down, match impedances, or isolate circuits entirely.
The elegance of the turns ratio—primary to secondary—never ceases to impress me. Whether it’s a massive utility transformer or a tiny toroidal inductor on a circuit board, the same rules apply: ratios matter, losses must be managed, and efficiency is king. Even after all these years, I find the beauty of this simple, reliable relationship remarkable.
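The ideal turns-ratio relationship is compact enough to write down directly. This sketch ignores losses and assumes an ideal transformer, with example winding counts:

```python
def secondary_voltage(v_primary: float, n_primary: int, n_secondary: int) -> float:
    """Ideal transformer: Vs / Vp = Ns / Np."""
    return v_primary * n_secondary / n_primary

# Step 120 V mains down to 12 V with a 10:1 turns ratio:
print(secondary_voltage(120, 1000, 100))  # 12.0 V
```

In an ideal transformer the current scales the opposite way (Is/Ip = Np/Ns), which is how power balances across the two windings.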
Enduring Lessons
Looking back, what strikes me most is how little these fundamentals have changed. The names—Faraday, Henry, Lenz—still echo through the equations, but the applications have multiplied beyond what I could have imagined as a teenager.
In high school, inductance was just another physics unit to get through before exams. Today, I see it as a quiet constant in my daily work—woven into motors, relays, solenoids, and transformers, underpinning so much of the technology we depend on.
The fundamentals of inductance have aged better than we have: steady, dependable, and still as sharp as ever.
One of the cornerstones of electrical engineering and electronics is Ohm’s Law. It’s the simple yet powerful relationship that ties together voltage, current, and resistance—and it forms the basis for everything from circuit design to troubleshooting.
The Basics: Ohm’s Law
Ohm’s Law is expressed as:
E = I × R
Where:
E is voltage in volts,
I is current in amperes, and
R is resistance in ohms.
This means that if you know any two of these values, you can calculate the third. The “Ohm’s Law Triangle” (or circle, depending on your preference) is a handy memory tool—cover the unknown quantity, and the relationship between the other two tells you how to solve for it.
Putting It Into Practice
The presentation walks through practical examples:
12 V across 96 Ω → current is 0.125 A.
1.5 A through 15 Ω → voltage is 22.5 V.
550 mV across a resistor carrying 0.1 A → resistance is 5.5 Ω.
200 mA through 2.5 kΩ → voltage is 500 V.
These worked problems highlight the importance of careful unit conversion—millivolts, milliamps, and kilohms need to be translated into base units to avoid errors.
Series and Parallel Resistors
A pile of random 1/8 W, 1/4 W and 1/2 W resistors of various constructions
The presentation also digs into combining resistances:
Series circuits: resistances simply add together. Current is the same everywhere, but the voltage divides across each resistor.
Parallel circuits: resistances combine using reciprocals, and the total is always less than the smallest branch resistor. Voltage across each branch is the same, while current divides according to resistance.
Worked examples show how to calculate total resistance, individual currents, and voltage drops in both series and parallel networks.
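The two combining rules reduce to a sum and a sum of reciprocals. A minimal sketch, with arbitrary example values:

```python
def series_resistance(*resistors: float) -> float:
    """Series: resistances simply add."""
    return sum(resistors)

def parallel_resistance(*resistors: float) -> float:
    """Parallel: reciprocal of the sum of reciprocals."""
    return 1.0 / sum(1.0 / r for r in resistors)

print(series_resistance(100, 220, 330))  # 650 Ohm
print(parallel_resistance(100, 100))     # 50.0 Ohm -- below the smallest branch
```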
Power in Electrical Circuits
Beyond voltage and current, we need to understand power, which is the rate at which energy is converted into work. In electrical terms:
P = E × I = E² / R = I² × R
This is where the mnemonic PIE (Power = I × E) comes in handy. The unit of power is the watt, equal to one joule per second.
Examples demonstrate that whether you start with voltage and resistance, or current and resistance, the result is the same: a 12 V source across a 50 Ω resistor dissipates 2.88 W of power, no matter which formula you use.
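The agreement between the three power formulas is easy to verify for the 12 V / 50 Ω example from the text:

```python
e, r = 12.0, 50.0
i = e / r  # 0.24 A by Ohm's law

p_from_ei = e * i      # P = E * I
p_from_er = e**2 / r   # P = E^2 / R
p_from_ir = i**2 * r   # P = I^2 * R

print(p_from_ei, p_from_er, p_from_ir)  # 2.88 W by every route
```

All three expressions are just Ohm's law substituted into P = E × I, which is why they can never disagree.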
Practical Considerations
Resistors aren’t just about resistance—they also have power ratings. Exceeding the rated wattage leads to overheating and failure. As a general rule, always design with a 50–100% safety margin.
The presentation also explores how power ratings are applied in series and parallel circuits, reinforcing the importance of checking each component’s limits before applying a load to a circuit.
Wrapping Up
The review questions at the end drive home the fundamentals:
Higher wattage bulbs consume energy faster.
Open circuits have no current.
Watts come from volts multiplied by amps.
Resistance equals voltage divided by current.
Ohm’s Law and the concept of power are simple tools, but they’re the foundation of every circuit we build, test, or repair. Whether you’re a student learning the basics or a professional brushing up, getting comfortable with these relationships is essential.
Today was the second class in the RAC Amateur Radio License course. We were studying Chapter Two of the Study Guide, Introduction to Electronics.
This chapter lays the foundation for understanding basic electricity, beginning with the fundamental components of matter. Before delving into the technical theory, Al discussed the Amateur Radio Operator Code of Conduct. I decided that the two codes Al mentioned were important enough to warrant their own post, so you can find them here.
We looked at atoms, their structure, and how the behaviour of electrons gives rise to electrical phenomena. From there, the discussion moved into conductors and insulators—why materials like copper and gold conduct electricity so well, while glass, rubber, and plastics resist it.
Niels Bohr’s model of the atom (1913)
Permittivity is a key idea in physics, especially in electromagnetism. It describes how a material reacts when an electric field is around. Understanding permittivity helps explain how electric fields work, as well as the functioning of capacitors, dielectrics, and electromagnetic waves.
Permittivity, denoted by the symbol ε (epsilon), is essentially a measure of how well a material can allow electric fields to pass through it. It tells us how much the electric field inside the material is weakened compared to what it would be in a vacuum or open space. This property depends on the material’s makeup, structure, and physical state.
Key ideas:
Electric Permittivity (ε0): This is the permittivity of free space, often called epsilon naught. It’s a fundamental constant that describes how electric fields behave in a perfect vacuum. Its value is about 8.854 × 10⁻¹² farads per metre (F/m). In a vacuum, the electric field moves without distortion or loss.
Relative Permittivity (εr): Also known as the dielectric constant, this is simply the ratio of a material’s permittivity to that of free space. It’s dimensionless and tells us how well a material can store electrical energy in an electric field compared to a vacuum. Materials like glass, ceramics, and many plastics have high relative permittivity, making them very effective at energy storage.
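The effect of relative permittivity on energy storage shows up directly in the parallel-plate capacitance formula C = εr·ε0·A/d. A sketch with illustrative plate dimensions:

```python
EPSILON_0 = 8.854e-12  # permittivity of free space, F/m

def parallel_plate_capacitance(area_m2: float, gap_m: float,
                               epsilon_r: float = 1.0) -> float:
    """Parallel-plate capacitor: C = epsilon_r * epsilon_0 * A / d."""
    return epsilon_r * EPSILON_0 * area_m2 / gap_m

# 1 cm^2 plates, 0.1 mm apart: vacuum vs. a ceramic with epsilon_r = 100
c_vacuum = parallel_plate_capacitance(1e-4, 1e-4)
c_ceramic = parallel_plate_capacitance(1e-4, 1e-4, epsilon_r=100)
print(c_vacuum, c_ceramic)  # the dielectric multiplies C by epsilon_r
```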
Insulators, or insulating materials, are substances that will not allow the flow of electrons through them because they contain very few free electrons, and they have a low dielectric constant (relative permittivity, εr).
Examples: Porcelain insulators used in power transmission on distribution poles and towers, rubber, glass, plastic, wood, etc.
Dielectrics, or dielectric materials, are insulating substances that become polarized when subjected to an external electric field. This polarization gives them the ability to store charge (energy), as in a capacitor, and they have a high dielectric constant (relative permittivity, εr).
Examples: A common example of a dielectric is the electrically insulating material between the metallic plates of a capacitor, such as mica or laminated paper. Other examples include air and ceramic.
All dielectrics are insulators, but not all insulators are dielectrics.
Everything becomes a conductor at certain temperatures or electric fields due to breakdown, as every insulator has its limits to withstand a potential difference across the material.
Key electrical concepts were introduced, including charge, current, voltage, and resistance. Al Penny VO1NO, our instructor, explained the coulomb as the standard unit of charge, the ampere as the rate of electron flow, and voltage as the “pressure” that pushes electrons through a conductor. Resistance and the factors that affect it—material type, length, diameter, and temperature—are also covered, along with the role of resistors and potentiometers in circuits.
The class then explored magnetism as one of the four fundamental forces of nature, showing how magnetic fields, poles, and materials influence electrical behaviour.
The four fundamental forces
This naturally led to a discussion of direct current (DC), its sources, and the role of cells and batteries. Al explained the difference between primary (non-rechargeable) and secondary (rechargeable) cells, the chemistry behind common examples like zinc–carbon and lead–acid batteries, and how cells can be connected in series or parallel to change voltage or current capacity.
By the end, the chapter tied together the essential elements of electricity—atomic theory, conductors and insulators, current, voltage, resistance, magnetism, and electrochemical cells—providing a solid grounding for anyone beginning their journey into radio and electronics.
This was a review for me, as I have worked with electronics and electricity throughout my career. However, for anyone who doesn’t have a grounding in these subjects or feels like they need a refresher, this was a great place to start.
If you are thinking about studying for an Amateur Radio Certificate, there are some excellent flashcard decks available for free on Ankiweb. One that I am using is the ISED basic amateur questions (2025) deck.
Back in the early 2000s, Molson Brewery ran an ad campaign called “I AM.” One of the ads from that campaign really resonated across the country. Here it is in case you haven’t seen it.
Jeff updated the original ad during the recent “51st state” fracas between the US and Canada. The video and the image below effectively capture what being Canadian means to me. Jeff Douglas’s passion for our core values and the differences between Canadians and Americans resonates with me.
We Are Canadian by Jeff Douglas, 2025
Pierre Elliott Trudeau was one of Canada’s great Prime Ministers. His words, spoken in front of the Ukrainian-Canadian Congress in 1971, are still relevant today. Ironically, he said these words to a group of people, many of whom had fled the Soviet Union to escape the Holodomor in the 1930s. Canada has welcomed many Ukrainian refugees fleeing the war being waged on them by Russia. Canadians help people in need. That’s what we do.
Pierre Trudeau’s Remarks to the Ukrainian-Canadian Congress, 1971-10-09.
Finally, despite all the criticism, I like the video by Mark Carney and Mike Myers that was released just before the 2025 federal election.