A buffer overflow occurs when an intruder sends an application more data than its buffer can hold and the application fails to check the size, allowing the excess data to overwrite adjacent memory and, potentially, to execute malicious code. The two terms are related as cause and effect: an unchecked buffer is the cause (code that never verifies the size of incoming data against the buffer's capacity), and a buffer overflow is the effect. Because of this, servers of all kinds (Web servers, database servers and so on) are vulnerable to buffer overflow attacks. Now is the time to patch your servers, firewalls, routers and other devices.
Related Q&A from Luis Medina