Jon Oltsik, an analyst with Enterprise Strategy Group Inc., in Milford, Mass., examined the effect of new cybersecurity regulations from the state of New York's Department of Financial Services. The New York cybersecurity regulations overlap with many federal requirements, but are even more stringent, including provisions that companies must have a chief information security officer, or CISO, in place, as well as policies that dictate how nonpublic data must be protected and how data can be securely destroyed.
"At this point, the NY State DFS regulations are the most stringent (civilian) rules in existence. Thus, other countries, industries and states will have a keen interest in how they roll out, what challenges they present, and how they are modified in the future," Oltsik wrote in a blog post.
Among the new rules put forth in the New York cybersecurity regulations are requirements that entities establish and maintain "a cybersecurity program" and stipulate that organizations hire a CISO to oversee operations. Currently, Enterprise Strategy Group research indicates 67% of organizations already employ a CISO. The new regs also require the establishment of a security operations and analytics platform architecture, a well-documented incident response plan, audit logs and the monitoring of authorized users.
Oltsik indicated the new requirements will likely drive a wave of user behavior analytics and threat intelligence software sales. "Aside from cybersecurity people and technologies, the new rules ought to be [a] boon for lawyers. The DFS regulations are new, and so what to do and how to do it is up for some interpretation. This should keep NY-based cybersecurity-savvy attorneys busy for some time," Oltsik wrote.
Dig deeper into Oltsik's thoughts on the New York cybersecurity regulations.
The network edge will take on the cloud
Gartner analyst Tom Bittman said even as cloud computing "eats" data centers, it also will be "eaten" by the network edge. In Bittman's view, IT is experiencing a collision of trends, as cloud computing and IT centralization run into the internet of things, machine learning and augmented reality.
"The agility of cloud computing is great -- but it simply isn't enough. Massive centralization, economies of scale, self-service and full automation get us most of the way there -- but it doesn't overcome physics -- the weight of data, the speed of light," Bittman wrote in a blog post. In spite of the speed and efficiency of data centers, users must perform tasks in real time.
Bittman said he envisions a future that goes well beyond today's edge computing, with software agents, processing and storage residing in a fortified network edge. Fueling that move: Cloud computing providers struggle to meet the real-time needs of users, driving a shift of workloads to the network edge.
"The time to have an edge strategy is very, very soon. Watch as VR [virtual reality] goggles start to take off, and heads-up displays in cars, and mixed reality apps on smartphones," Bittman wrote. "Enterprises have focused on cloud computing, and have been developing strategies to 'move to the cloud' or at least 'expand into the cloud.' It's been a one-way, straight highway. There's a sharp left turn coming ahead, where we need to expand our thinking beyond centralization and cloud," he added.
Explore more of Bittman's thoughts on the future expansion of the network edge.
Tackling ever-increasing network complexity
Ivan Pepelnjak, blogging at ipSpace, said he sees a lot of complexity in modern networking -- in part, due to vendors' product portfolios that add a lot of unnecessary difficulty to the equation. But Pepelnjak said vendors aren't the only ones at fault.
Indeed, he said, much of that complexity is driven by customers themselves -- either by not being critical enough to cut through the marketing hype or by being too afraid to commit to what they actually want. As a result, he said, networking teams may fail to develop or invest in robust network designs.
"Is it fair to make fun of the complexity-ridden legacy vendors? Well, it definitely makes for fun reading, but maybe we should just respect old age while at the same time telling the dinosaurs it's time to change by voting with our wallet," Pepelnjak wrote. He added that in spite of the current complexity in networking, it is still possible to design "robust large-scale" networks if engineers understand how their technology works and choose the best tools to manage it.
Read more of Pepelnjak's thoughts on network complexity.