One of the more esoteric aspects of ham radio is the use of "digital" modes. These are, in essence, much like the dial-up modems of old, which used phase and/or frequency shift keying of a carrier over a POTS line to transfer digital data. In the ham radio realm, the RF is modulated in various fashions to communicate data digitally. As with the modem, various modes or protocols are used, a couple of common ones being PSK31 and JT65. I have begun looking into these modes and have been doing some research on the topic. This post is part tutorial, covering what I have learned from reading various sources scattered about the Internet, and part open discussion, both for those who are knowledgeable on the subject and for those who would like to learn more.
In the typical setup, a sound card sits between the computer and the radio: it digitizes the received audio for analysis by the computer and generates the analog waveform that the radio uses as its modulating signal on transmit. A popular external interface for this purpose is the SignaLink, which presents itself as a USB sound card. Some radios, such as the Yaesu FT-991(A) that I am running, have a built-in sound card (the FT-991/991A use a TI chipset for this function) that appears as a USB audio device. For operation under Windows, Yaesu provides drivers, and similar ones exist under Linux. As I am primarily a Linux user, this discussion will focus on interfacing under Linux. One side note: if operating under Windows, install the drivers BEFORE you plug the radio into your computer.
Under Linux, devices are virtualized under the /dev tree. The command "lsusb" will list the USB devices attached to your machine. For example, on the machine I am writing this post on, one line of the output is:

Bus 001 Device 006: ID 046d:c52b Logitech, Inc. Unifying Receiver

corresponding to the wireless receiver for my mouse and keyboard. Note the 046d:c52b, which corresponds to the vendor and product ID numbers as assigned by the USB consortium. Your USB sound card(s) will have a similar ID. In order to use the sound card in the digital mode applications, you need to know what the device is in your /dev tree. In all likelihood the radio's serial control port will be assigned an identifier such as /dev/ttyUSB0, where the "tty" is a relic from the old RS-232 modem days. As this depends on the order in which the devices are enumerated, it is preferable to assign them a consistent name, and for this we make use of udev.
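If you want to pull just the ID pair out of an lsusb line, say for use in a udev rule, a little sed will do. This is a sketch using the Logitech line above as sample input; substitute a line matching your own device:

```shell
# Extract the vendor:product ID (e.g. 046d:c52b) from one line of lsusb output.
line='Bus 001 Device 006: ID 046d:c52b Logitech, Inc. Unifying Receiver'
id=$(printf '%s\n' "$line" | sed -n 's/.*ID \([0-9a-f]\{4\}:[0-9a-f]\{4\}\).*/\1/p')
echo "$id"    # prints 046d:c52b
```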
udev is the Linux device manager; it works according to a set of rules to assign consistent names to devices. We can write a udev rule to give these USB sound card(s) a name like "rig". See this tutorial for information on doing this. Note that you will probably have TWO sound cards or codecs, one for transmitting and one for receiving, which will be on different ports.
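As a sketch, a rule file along these lines (saved, say, as /etc/udev/rules.d/99-rig.rules) would create a persistent /dev/rig symlink. The vendor and product IDs shown are placeholders, not the FT-991's; substitute the pair your own lsusb output reports:

```shell
# /etc/udev/rules.d/99-rig.rules -- hypothetical IDs; replace 1234/abcd with
# the idVendor/idProduct your lsusb output shows for the radio's port.
SUBSYSTEM=="tty", ATTRS{idVendor}=="1234", ATTRS{idProduct}=="abcd", SYMLINK+="rig"
```

After editing the rule, run "sudo udevadm control --reload-rules" and re-plug the radio so the symlink is created.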
Once we have the device, we can configure the first of the applications we will be using: Hamlib. Each radio has its own set of commands that a computer can issue to perform the various functions, such as tuning to a particular frequency. Hamlib provides an abstraction layer that sits between each radio and the upper-layer application and exposes a common set of interface commands. Hamlib provides for control of TWO items: the 'rig' (the radio) and a rotator. The second half of the tutorial I linked to above covers a basic configuration of Hamlib under Linux.
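A quick way to sanity-check a Hamlib install before any hardware is involved is its rigctl command-line tool with the built-in dummy rig (model 1). This sketch assumes rigctl is on your PATH; the serial settings in the commented line are illustrative, not specific to any radio:

```shell
# Ask the dummy rig (Hamlib model 1) for its current frequency -- no radio needed.
rigctl -m 1 f

# Against real hardware you would name your rig's model number and port instead,
# e.g. (illustrative values only):
# rigctl -m <model> -r /dev/rig -s 38400 f
```

Running "rigctl -l" lists the model numbers Hamlib knows about, so you can find the one matching your rig.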
Once you have the abstraction layer set up, you can use it with another application, such as FLDigi, which is available for multiple platforms, including Windows, Linux, and, I believe, MacOS. This is the "work horse" that will work with the data and perform the digital conversion. In addition to interfacing with Hamlib, FLDigi also has XML files describing the command sets of various radios. At this point, I am not really certain where the functionality of one ends and the other begins. From what I understand there are a few different means of controlling the radio, Hamlib being one. Another option is RigCAT, or even direct manipulation via the hardware PTT function in FLDigi (which uses the XML file?). See this page from the FLDigi user's manual for some examples of these options. Note that it makes reference to the USB devices in the /dev tree, which is why I discussed the importance of knowing what they are and considered assigning them persistent names.
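One common arrangement, sketched below, is to run Hamlib's rigctld daemon and let FLDigi (or any other client) talk to it over a local TCP socket rather than opening the serial port directly. The dummy rig is used here for illustration; for real hardware you would substitute your model number and point -r at your serial device:

```shell
# Start the Hamlib network daemon on its default port (4532), using the
# dummy rig (model 1) for illustration; for real hardware something like
# "rigctld -m <model> -r /dev/rig -s 38400 &" instead.
rigctld -m 1 &

# Any client can now query the daemon; model 2 is Hamlib's "NET rigctl"
# backend, which connects to a running rigctld:
rigctl -m 2 -r localhost:4532 f
```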
This is about as far as I've gotten in my study of the setup, but there are a couple of other things that should be noted. Under Linux you will need a sound server. ALSA is built into the kernel, but a more feature-rich option, with several features that benefit ham radio control, is PulseAudio; another is OSS. Note in this tutorial page on configuring FLDigi where it shows using PulseAudio as well as configuring the modem to transmit on both the right and left channels. The same tutorial also shows configuring FLDigi to use the XML file and Hamlib combined (see the paragraph above).
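With PulseAudio running, you can list the playback and capture devices it sees using pactl; in a working setup the radio's codec should appear alongside the PC's built-in hardware. The names in the comment are what I would expect, not guaranteed:

```shell
# List PulseAudio output (sink) and input (source) devices, one per line.
pactl list short sinks
pactl list short sources

# With the radio attached and recognized, one of the USB entries should
# contain something like "PCM2903B" for the rig's codec.
```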
Lastly, the tutorial mentions that the codec (for the FT-991 and likely other Yaesu radios) appears in the hardware list as "PCM2903B Audio CODEC". One of the things you need to be sure to do is set the audio mixer, using a tool such as pavucontrol (an optional add-on for PulseAudio), to use these codecs as the input and output devices (as opposed to, say, the built-in PC speaker and microphone) and to properly set the volume levels.
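The same selection can be scripted with pactl rather than clicked through in pavucontrol. The device names below are placeholders; copy the real names from the output of "pactl list short sources" and "pactl list short sinks":

```shell
# Make the radio's codec the default capture and playback device, and set a
# starting volume level. <codec-source-name> / <codec-sink-name> are
# placeholders for the actual PulseAudio device names on your system.
pactl set-default-source <codec-source-name>
pactl set-default-sink   <codec-sink-name>
pactl set-sink-volume    <codec-sink-name> 50%
```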
That is as far as I've gotten so far. I spent a large part of yesterday, a rainy Sunday, reconfiguring a small laptop with my favorite distribution, Arch, and getting Hamlib and FLDigi compiled and installed. I still need to install PulseAudio and then investigate the drivers and configure the device naming. As with most things ham radio, it is an adventure, and if it were easy to master in an hour, it wouldn't be any fun.