I am attempting to get SVDM working outside of an SBBS environment and am having some trouble with different versions of the files mentioned in the message title. Both SVDM01 and SVDM03 are showing the same results. All of the following errors are happening on XP SP3, Windows 7 SP1, and Windows 10 22H2. All are 32-bit OS versions using the native NTVDM. The copy of SBBSEXEC.DLL in System32 always matches the DLL version I am attempting to use.
The program I'm attempting to use to get the debug log entries is simply DSZ called with the following:
svdm.exe DSZ.EXE port 1 d t
Each program I've tried has the same results described below.
Using the versions of SBBSEXEC.DLL and DOSXTRN.EXE included in the SVDM archives works properly when using FOSSIL or Int 14h communications. However, when attempting to use the UART emulation (which is what I am after) things fall apart. From a user's point of view the NTVDM simply locks up as soon as any communications are attempted. Looking into things using DbgView shows a number of nonsense reads and writes to the emulated UART registers followed by an endless stream of emulated hardware interrupts which also do not make sense. I can include a debug log of this, but frankly it's lengthy and is 100% reproducible.
Another oddity - when using the included SBBSEXEC.DLL, the UART emulation defaults to base address 0x000 and interrupt 0x0. The lockups do not occur when this is left as-is, but none of the UART emulation does anything. Setting a ComPort in a [UART] section of SVDM.INI or SBBSEXEC.INI does change the I/O address and IRQ to the expected values, but even that isn't strictly necessary: the mere existence of one of those two INI files, even if empty, sets the UART to the expected default of 0x3F8/IRQ 4. Without any INI file at all, the debug entry is always:
[180] SBBS: Virtualizing UART (0x0, IRQ 0)
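For reference, the minimal INI I tested with looks like the lines below. I'm assuming here that ComPort=1 selects the conventional COM1 mapping, which matches the 0x3F8/IRQ 4 default I see whenever an INI file exists at all:

; Assumed minimal SVDM.INI / SBBSEXEC.INI contents;
; ComPort=1 is presumed to map to the usual COM1 values (0x3F8, IRQ 4)
[UART]
ComPort=1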
Changing the SBBSEXEC.DLL to the one included in the SBBS 3.19b update package does allow things to work mostly as expected. UART emulation and the virtual modem both work properly until pushed somewhat hard (such as a file transfer or sending a text file of more than a few screens).
Eventually a number of read errors occur, followed by failure to communicate with the UART at all. The NTVDM does not lock up but no further communications are possible. The debug log shows an endless stream of these entries when this has happened:
[208] SBBS: !input_thread: ReadFile Error 122 (size=0)
[208] SBBS: !input_thread: ReadFile Error 122 (size=0)
[208] SBBS: !input_thread: ReadFile Error 122 (size=0)
Using SBBSEXEC.DLL from SBBS 3.19c has the same behavior as when using the DLL included in the SVDM distributions.
Using the DOSXTRN.EXE file from SBBS 3.19b vs the 3.19c/SVDM distribution does not seem to make any difference in behavior. However, I don't want to add any mix-n-match problems into the equation.
Any ideas on which is the 'correct' combination of these files to use for reliable UART emulation with SVDM?
The latest and greatest sbbsexec.dll and dosxtrn.exe can be found in the nightly builds of Synchronet for Windows:
How did you determine the read/writes were "nonsense"?
I'd be happy to try to address whatever issues with the UART emulation aren't working for you, but please update to the latest, get new/updated debug log output, and share it with me.
As I looked into what was going on I moved to lower and lower level diagnostic programs until I finally just wrote my own to know exactly what was being done on the program side.
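It's not the exact program, but the core of it is basically the sketch below (Borland-style port I/O via <dos.h>, assuming the emulated UART sits at the COM1 default of 0x3F8). Every one of these reads and writes should show up, in order, in the DbgView log:

#include <stdio.h>
#include <dos.h>                     /* inportb()/outportb(), Borland/Turbo C */

#define UART_BASE 0x3F8              /* COM1 default, matching the emulation */

int main(void)
{
    static const char *regs[8] = {
        "RBR/THR", "IER", "IIR/FCR", "LCR", "MCR", "LSR", "MSR", "SCR"
    };
    int i;

    /* Program 9600 8N1: set DLAB, write divisor 12 (115200/12), clear DLAB */
    outportb(UART_BASE + 3, 0x80);   /* LCR: DLAB=1             */
    outportb(UART_BASE + 0, 12);     /* divisor latch low byte  */
    outportb(UART_BASE + 1, 0);      /* divisor latch high byte */
    outportb(UART_BASE + 3, 0x03);   /* LCR: 8N1, DLAB=0        */

    /* Dump each register so every access is visible in the debug log */
    for (i = 0; i < 8; i++)
        printf("%-8s (base+%d) = 0x%02X\n", regs[i], i, inportb(UART_BASE + i));

    return 0;
}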
I'd be happy to try to address whatever issues with the UART emulation aren't working for you, but please update to the latest, get new/updated debug log output, and share it with me.
I had downloaded the latest SBBSEXEC.DLL the morning after you made the initialization change and have tried it out. It's working 100%, as is the version downloaded today! Pushing the UART hard also no longer creates any errors or even any unusual debug log entries. Thanks again for fixing this.
There are a couple of other issues I would like to mention.
1. When SVDM uses an inherited socket (the -h option) no telnet negotiations are done. As a result, the connection is assumed to be in ASCII mode and server-side CR characters are translated to CR/LF. Since most programs are already transmitting CR/LF, this gets translated to CR/LF/LF, with the expected results. When using an external socket in telnet mode, could SVDM set the telnet.local_option and telnet.remote_option variables as follows:
A. Assume both remote and local have already suppressed GA and set the two options accordingly
B. Set the remote telnet echo option to off and set the local telnet echo to follow the ServerEcho option from the .INI file
C. Set both remote and local BINARY_TX options to follow the ServerBinary option from the .INI file
I don't think it's unreasonable to assume these have already been set up when the telnet connection was initially made. If someone really wants to change the behavior, they could still do so via the .INI file options mentioned. The GA and echo options probably make no difference now, but leaving them unset might cause trouble somewhere down the line. (A rough sketch of what I mean follows this list.)
2. Can anything be done to reduce the CPU usage?
3. The VDMODEM isn't importing target_ia32.props and thus is using SSE2 instructions.
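To make item 1 concrete, here's roughly what I have in mind, purely as a sketch. I'm assuming telnet.local_option/telnet.remote_option are effectively per-option flags indexed by the standard Telnet option codes (TRANSMIT-BINARY=0, ECHO=1, SUPPRESS-GO-AHEAD=3 from RFCs 856/857/858); the struct and function names below are mine, not SVDM's:

#include <stdbool.h>

#define TELOPT_BINARY_TX  0   /* TRANSMIT-BINARY, RFC 856   */
#define TELOPT_ECHO       1   /* ECHO, RFC 857              */
#define TELOPT_SUP_GA     3   /* SUPPRESS-GO-AHEAD, RFC 858 */

struct telnet_state {
    bool local_option[256];   /* what we have agreed to do        */
    bool remote_option[256];  /* what the remote has agreed to do */
};

/* For an inherited (-h) socket, assume negotiation already happened. */
static void assume_negotiated(struct telnet_state *telnet,
                              bool server_echo, bool server_binary)
{
    /* A. assume both ends already suppressed Go-Ahead */
    telnet->local_option[TELOPT_SUP_GA]  = true;
    telnet->remote_option[TELOPT_SUP_GA] = true;

    /* B. remote echo off, local echo follows the ServerEcho .INI option */
    telnet->remote_option[TELOPT_ECHO] = false;
    telnet->local_option[TELOPT_ECHO]  = server_echo;

    /* C. both TX-binary options follow the ServerBinary .INI option */
    telnet->local_option[TELOPT_BINARY_TX]  = server_binary;
    telnet->remote_option[TELOPT_BINARY_TX] = server_binary;
}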
Thanks yet again for all the work you've done on this and for fixing the issue I was having.
1. When SVDM uses an inherited socket (the -h option) no telnet negotiations are done.

I'll be committing a change here to address that - basically send the Telnet commands to re-negotiate those operating parameters (the same sequence that happens when answering an incoming Telnet connection).
I added 2 new .ini settings for you to play with:
- MainLoopDelay (default: 0, set to 1+ to add CPU yield)
- SocketSelectTimeout (default: 0, set to 1+ to add CPU yield)
Re: SVDM - Which SBBSEXEC.DLL and DOSXTRN.EXE version?
By: Digital Man to Fzf on Mon Mar 25 2024 04:27 pm
1. When SVDM uses an inherited socket (the -h option) no telnet negotiations are done.

I'll be committing a change here to address that - basically send the Telnet commands to re-negotiate those operating parameters (the same sequence that happens when answering an incoming Telnet connection).
It addresses the local configuration but unfortunately it still doesn't set remote options. The remote is usually going to be in binary mode but SVDM has the remote option set to ASCII by default. A CR from the remote then gets held up until a second byte is sent.
Sending a DO TX_BINARY alongside the WILL TX_BINARY when in ServerBinary mode, and sending a DONT TX_BINARY when not in ServerBinary mode but using an external socket, sets the remote options to match what SVDM is expecting. Clients might not like having their TX binary mode turned off mid-session, but if someone is disabling binary mode on the server side they are already doing something weird.
This also sets the remote to binary when SVDM answers in listen mode; at the moment SVDM leaves the remote TX in ASCII at all times.
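On the wire that's just the three-byte Telnet command (IAC, DO or DONT, option 0 = TRANSMIT-BINARY per RFC 856). A minimal sketch of what I mean, with the socket handle and ServerBinary flag as stand-ins:

#include <winsock2.h>
#include <stdbool.h>

#define TELNET_IAC       255
#define TELNET_DO        253
#define TELNET_DONT      254
#define TELOPT_BINARY_TX   0   /* TRANSMIT-BINARY, RFC 856 */

/* Ask the remote to enable/disable binary TX to match ServerBinary. */
static void request_remote_binary(SOCKET sock, bool server_binary)
{
    unsigned char cmd[3] = {
        TELNET_IAC,
        (unsigned char)(server_binary ? TELNET_DO : TELNET_DONT),
        TELOPT_BINARY_TX
    };
    send(sock, (const char *)cmd, (int)sizeof cmd, 0);
}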
I added 2 new .ini settings for you to play with:
- MainLoopDelay (default: 0, set to 1+ to add CPU yield)
- SocketSelectTimeout (default: 0, set to 1+ to add CPU yield)
These work perfectly, thanks! Just a simple 1 ms delay in the main loop drops CPU usage to 0% most of the time.
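For anyone following along, the way I picture the two settings working is roughly the sketch below. This is my assumption about the mechanism, not SVDM's actual code; only the MainLoopDelay/SocketSelectTimeout names come from the .INI:

#include <windows.h>
#include <stdbool.h>

static DWORD main_loop_delay       = 1;  /* MainLoopDelay (ms), 0 = old behavior       */
static DWORD socket_select_timeout = 1;  /* SocketSelectTimeout (ms), 0 = old behavior */

static bool service_uart(void)
{
    /* ...move any pending bytes between the emulated UART and the socket... */
    return false;                        /* nothing to do this pass */
}

static bool service_socket(void)
{
    /* ...select() on the socket, blocking up to socket_select_timeout ms
       instead of returning immediately... */
    (void)socket_select_timeout;
    return false;
}

int main(void)
{
    for (;;) {
        bool busy = service_uart();
        busy |= service_socket();

        /* With MainLoopDelay=0 this loop spins and pegs a core; a 1 ms
           Sleep() yields the CPU whenever there was nothing to do. */
        if (!busy && main_loop_delay)
            Sleep(main_loop_delay);
    }
}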
I also looked into the error 122 in the SBBSEXEC input_thread when SVDM gets pushed hard, such as during a file transfer. A little additional information on the next waiting mailslot message makes it pretty clear. Sorry, these are going to wrap oddly:
SBBS: !input_thread: ReadFile Error 122 (space=9411, count=0, nextsize=10000, waiting=46)
SBBS: !input_thread: ReadFile Error 122 (space=1211, count=0, nextsize=5056, waiting=45)
SBBS: !input_thread: ReadFile Error 122 (space=9635, count=0, nextsize=10000, waiting=26)
Etc. There's just not enough space in the ring buffer at the time.
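In other words, ReadFile() on a mailslot fails with ERROR_INSUFFICIENT_BUFFER (122) whenever the next message is larger than the buffer you hand it, and here that buffer is whatever free space the ring buffer has left. A sketch of the situation (the ring-buffer handling and names are stand-ins, not the real sbbsexec code):

#include <windows.h>
#include <stdio.h>

/* Read one mailslot message into the free space of a ring buffer, but
   skip the read (instead of failing with error 122) until there is room. */
static DWORD read_mailslot_msg(HANDLE mailslot, char *buf, DWORD space)
{
    DWORD next_size = 0, msg_count = 0, rd = 0;

    if (!GetMailslotInfo(mailslot, NULL, &next_size, &msg_count, NULL))
        return 0;
    if (msg_count == 0 || next_size == MAILSLOT_NO_MESSAGE)
        return 0;                /* nothing waiting */
    if (next_size > space)
        return 0;                /* would fail with ERROR_INSUFFICIENT_BUFFER;
                                    wait for the ring buffer to drain first */
    if (!ReadFile(mailslot, buf, space, &rd, NULL))
        fprintf(stderr, "ReadFile error %lu (space=%lu, nextsize=%lu, waiting=%lu)\n",
                (unsigned long)GetLastError(), (unsigned long)space,
                (unsigned long)next_size, (unsigned long)msg_count);
    return rd;
}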
While these messages are harmless, the sheer number of them can thrash a CPU pretty good right at a time when the CPU is already busy. I changed the logging to log error 122 at a lower priority so it can be squelched unless debugging is needed. That further drops the CPU usage when SVDM is processing a lot of data.
Does your gitlab accept anonymous updates, or can I send you a diff?
Thanks again for all your work on this!