What is Debugging?

R. Kayne

Debugging refers to a process in software development whereby program analysts comb through computer code looking for "bugs" — the source of errors, flaws or security holes in the internal program instructions. Hardware development also goes through debugging to ensure compatibility with current hardware standards and interoperability between components that adhere to the same protocols. Additionally, debugging helps ensure that hardware and software are backward compatible, or will coexist with preexisting standards that might still be in use.

The strengths and weaknesses of new products may be tested by beta testers.

Software debugging takes place in two phases. The first phase is known as "alpha testing," and is performed in-house before the software is made public. The second phase is carried out through a public process known as "beta testing." Beta testers are volunteer computer enthusiasts who use beta software at their own risk, under an agreement that errors or problems will be reported to developers.

Since bugs must be identified before they can be patched, software needs to be tested under various conditions.

Since bugs must be isolated and identified before they can be patched, the first step is to test the software under various conditions. When a bug reveals itself, the debugger takes note of the exact conditions under which the bug appeared, including the function that was running, the operating system type and version, and any other software or hardware components that might be relevant. Public beta testers submit detailed reports online listing these pertinent details, typically by filling out a pre-designed form.
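The kind of bug-report form described above can be sketched in a few lines of code. The following Python snippet is purely illustrative (the function name and report fields are assumptions, not taken from any particular bug tracker): the tester supplies what they observed, and the environment details are collected automatically from the running system.

```python
import platform
import sys

def build_bug_report(function_name, description):
    """Assemble a bug report capturing the conditions under which a bug appeared.

    `function_name` and `description` come from the tester; the environment
    fields are gathered automatically, mirroring a pre-designed report form.
    """
    return {
        "function": function_name,                 # the code path that was running
        "description": description,                # what the tester observed
        "os": platform.system(),                   # operating system type
        "os_version": platform.release(),          # operating system version
        "runtime_version": sys.version.split()[0], # interpreter/runtime version
        "machine": platform.machine(),             # hardware architecture
    }

# Example use: a tester reporting a crash in a (hypothetical) save routine.
report = build_bug_report("save_file", "crash when saving to a network drive")
```

Collecting the environment automatically, rather than asking users to type it in, is exactly why real report forms pre-fill these fields: it removes a common source of inaccurate reports.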

Once both debugging phases are completed, the software program is ready for a general release as a stable version. However, debugging continues as a maintenance protocol for the life of the product, intensifying with major upgrades.

Hardware is debugged before it reaches the market and does not undergo real-world beta testing per se, as this would be too expensive and problematic for a number of fairly obvious reasons. Instead, most hardware manufacturers provide an online interface where users can get technical support or report problems with hardware. In many cases these problems turn out to be user error, but the process also serves to reveal bugs that were not caught in the initial debugging phase. Debuggers can go back to the instructions encoded in the controlling chips and make changes to rid the hardware of the bugs. The manufacturer can then provide a firmware upgrade that users can download online to update their hardware.

While in-house debugging can certainly rid software and hardware of many bugs, nothing replaces real-world testing. It is virtually impossible for an author or manufacturer to replicate every conceivable condition and system under which the hardware or software will be used. For this reason, many experienced IBM-PC users wait 12 to 36 months before migrating to a new operating system, such as making the switch from Windows XP to Windows Vista. This gives the community time to identify any major security problems, bugs or other initial problems that might require debugging and patching.

Debugging guarantees that hardware and software are backward compatible.


Discussion Comments


Why is debugging only used for software?


Anon 17004 and 6177: I'm having the same issues and was wondering why all of a sudden my box wants to "debug" every site I go to. This is something new for me and I would like some answers also.


Must I debug when prompted by my computer?

Can I do any harm by debugging? To debug, must I click the 'break' button? Is that all I would have to do to debug?


You have a maintenance program scheduled to run that is likely not debugging but doing something else (maybe defragging like the other poster suggests -- and yes, there are programs that will defrag on a schedule, but they usually do it when the computer is idle and shouldn't shut processes down). It might also be running a spyware or virus scan. Go to Control Panel > Scheduled Tasks and see what's on tap. You can also use a startup manager (there are lots of free ones) to see what programs are running at boot and disable the offending program, or open it and reconfigure it to not run automatically.


What can I do to stop the debugging always appearing on my computer?


Anon6177 -- Do you mean "debug"? Or do you mean defrag? Although to defrag, I think you have to run it manually ... I don't think you can set your computer to defrag daily. Can you? Or probably you mean that your virus scanner is scheduled to run daily. But that doesn't close down the internet. Regardless, whatever program is running, you should note it, go into it and change the configuration for when it is set to run. You can set it to a time when you aren't on the computer -- that way it won't interfere with your work.


Why does my computer keep having to debug? It does it on a daily basis, thus closing down the internet and losing information.
