

SDN's role in enhancing network security: What will it be?

SN blogs: How can SDN be used to fortify network security? And is it time to be worried about software libraries?

Enterprise Strategy Group senior principal analyst Jon Oltsik says this week's VMworld will spotlight the role software-defined networking could play in enhancing network security. An ESG survey of network security pros found five key areas where SDN might help, he writes. The first is using SDN to selectively block malicious traffic to endpoints while allowing normal traffic flows. The second is using the architecture to improve network policy auditing and conflict detection and resolution. SDN could also be used to centralize network security policies. Oltsik says that even though SDN is still evolving and its tangible security benefits remain in the future, security pros are already determining how those benefits will align with enterprise requirements.
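The first use case above -- centrally pushing rules that drop traffic from known-bad sources while letting normal flows through -- can be sketched in miniature. Everything here is hypothetical (the rule format, function names and blocklist addresses are illustrative, not any vendor's API); real SDN controllers express the same match/action idea through their own interfaces, such as OpenFlow flow entries.

```python
# Hypothetical sketch of SDN-style selective blocking: a controller
# compiles a blocklist into prioritized match/action rules, with a
# low-priority catch-all that forwards normal traffic.

MALICIOUS_SOURCES = {"203.0.113.7", "198.51.100.23"}  # example blocklist

def build_flow_rules(blocklist):
    """Translate a blocklist into ordered match/action rules."""
    rules = [{"match": {"src_ip": ip}, "action": "drop", "priority": 100}
             for ip in sorted(blocklist)]
    # Empty match = wildcard: allow all other flows to continue.
    rules.append({"match": {}, "action": "forward", "priority": 1})
    return rules

def apply_policy(packet, rules):
    """Return the action for a packet, honoring rule priority."""
    for rule in sorted(rules, key=lambda r: -r["priority"]):
        if all(packet.get(k) == v for k, v in rule["match"].items()):
            return rule["action"]
    return "forward"

rules = build_flow_rules(MALICIOUS_SOURCES)
print(apply_policy({"src_ip": "203.0.113.7"}, rules))  # drop
print(apply_policy({"src_ip": "192.0.2.10"}, rules))   # forward
```

The point of the pattern is that the blocklist lives in one place (the controller) rather than being configured device by device, which is what makes the centralized-policy benefit Oltsik describes possible.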

Find out some of the other benefits ESG believes SDN may provide security pros.

Technology leaders face critical time

The next two years will be a critical time for technology leaders, writes Nemertes Research CEO Johna Till Johnson. The firm's annual conference, held late last month, illustrated some of the challenges tech executives will face, fueled by the evolution of information technology into enterprise technology. That shift means companies will have to embrace such innovations as intelligent networks and machine-to-machine technologies. Cloud, mobility and collaboration services are also reshaping how companies operate, Johnson writes. Leaders must also adapt to a changing workforce as baby boomers retire and millennials assume more important roles. To grow effectively, Johnson advises companies to take a page from the likes of Ford Motor Co. and launch innovative programs. Additionally, she says companies need to take a close look at emerging technologies like DevOps, wearables and network functions virtualization. Finally, she writes that the next generation of workers can't be expected to oversee legacy systems; instead, they should be encouraged to use evolving methodologies and then trained to ensure those methods match corporate goals.

See what other tasks technology leaders will be facing.

Who's to blame when change is resisted?

That's the provocative question Ethereal Mind blogger Greg Ferro asks in discussing whether it's people or the system that is most responsible for blocking change. To Ferro, it's definitely the system. People, he writes, are hamstrung by IT leaders who won't take the necessary risks or adopt initiatives that support fast-moving IT processes. "Workers can't change the system that is forced onto them, and that's a decision made by poor leaders in IT," Ferro writes. "There are those who are resistant to change, but much of the blame for that lies with executives who don't want to change the system." Ferro takes particular aim at the Information Technology Infrastructure Library (ITIL), which, he says, is failing because of the advent of convergence. The bottom line? "Operational processes are driven to change while ITIL prevents it."

Read Ferro's other comments about ITIL and the impact of convergence.

When good libraries go bad

Gartner Inc. Research Director Mario de Boer wonders about software libraries -- in particular, how vulnerable those ubiquitous pieces of code might be to hackers. For his purposes, de Boer examined zlib -- not because it's insecure (its last security patch was released nine years ago), but because it's the type of library that could cause serious problems if malicious code were ever introduced into it.

"Security of such ubiquitous libs is of paramount importance," he writes. "A vulnerability impacts many applications and devices at the same moment. Applications usually do not share the libs, so this would result in many applications and devices being patched."

Complicating library security, de Boer writes, is that nobody is quite sure whether the libs were developed securely and tested before publication. If integrators are in fact testing them, that's fine, but it's a very inefficient process. Instead, he calls for "transparent" and "rigorous" security testing of ubiquitous libs, suggesting it be a collaborative effort.
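The patching concern de Boer raises is easy to see in practice: applications typically carry, or are compiled against, their own copy of a library like zlib, so the version built in and the version loaded at runtime can differ. Python's standard library exposes both, which makes for a minimal illustration (no hypothetical names here; these are real attributes of Python's zlib module):

```python
# Show the zlib version Python was compiled against vs. the shared
# library actually loaded at runtime -- a mismatch is exactly why a
# single zlib fix can mean patching many applications separately.
import zlib

print("compiled against zlib:", zlib.ZLIB_VERSION)
print("running with zlib:   ", zlib.ZLIB_RUNTIME_VERSION)

# Quick round-trip sanity check that the loaded library works:
data = b"ubiquitous library" * 100
assert zlib.decompress(zlib.compress(data)) == data
```

If a vulnerability were found in zlib, every application bundling its own copy would need its own rebuild or update, which is the scale of the problem de Boer is pointing at.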

Check out de Boer's thoughts about lib security.

This was last published in August 2014

