
5 best practices for a more secure network

There are two approaches to the modern landscape. Not to oversimplify, but you either throw your arms up or roll up your sleeves

Network security was never the simplest of propositions. The rise of the Internet, and then the mobile Internet, and then Web applications, and then portals for file uploads, and then mass cloud migration, and then remote work, and then, and then… A series of complex challenges that arose in pursuit of progress has led to a ballooning attack surface, more cybersecurity risk, and the need for a reevaluation of how organisations should be protecting themselves.

Without this reevaluation, just imagine for a moment what could happen. Cybercriminals dropping malware through social engineering and exploitation of vulnerabilities. Data leaks of brand-shattering and regulator-enraging proportions. Ransomware attacks, zero-day attacks, distributed denial-of-service (DDoS) attacks, man-in-the-middle (MitM) attacks — a never-ending assault from which there would be no shelter.

In the final analysis, there are two approaches to the modern landscape. Not to oversimplify, but you either throw your arms up or roll up your sleeves. The former option is a highway to harm. Assuming you choose the latter, you can survive and thrive. And here are five best practices to help you on your way.

Sertan Selcuk, VP of Sales, META at OPSWAT

1. Solutions that meet needs

This may seem obvious, but you would be surprised how many organisations across the region throw money at point solutions that are not relevant to their business models. Caveat aside, there are many network security solutions that are helpful in thwarting nefarious activity. Forward/reverse proxies shield connections and hide IP addresses, enforcing single points of entry and exit. Load balancers do a similar job and boost reliability, scalability, and efficiency. Web application firewalls (WAFs) inspect application traffic for malicious intent, combating threats like cross-site scripting, SQL injection, and DDoS.

Next-generation firewalls combine traditional filtering of traffic at the network and transport layers with awareness of the applications generating it, while ingress controllers are like load balancers but designed specifically for container environments such as Kubernetes. Secure Web gateways (SWGs) act as filters between users and the Internet, blocking access to unauthorised sites and inspecting traffic, including encrypted traffic. Application delivery controllers (ADCs) offer proxy, load balancing, and WAF functionality, as well as an API gateway and application acceleration. And managed file transfer (MFT) solutions grant visibility into and control over large file transfers, in contrast with FTP or plain HTTP, which have no inherent security functionality.

2. Scan everything, in transit and at rest

Web applications, compromised servers, internal or external file transfers, and a range of other channels can be used by threat actors to gain a foothold and move laterally across the network. Organisations must scan every file that moves through these channels, of course. But they must go further. They must ensure only specific file types are permitted to cross their boundaries: no executables, scripts, or other potentially malicious content. This includes looking for prohibited files masquerading as permitted types. And to find malware, files should be scanned with multiple anti-malware engines that together cover a range of signatures, heuristics, and machine-learning models.
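To make the type checks concrete, here is a minimal Python sketch of an upload gate that rejects files whose extension is not on an allowlist or whose leading bytes do not match the type the extension claims. The extension list and signature map are illustrative assumptions, not a complete policy, and a production check would validate the full file structure rather than just the first few bytes.

```python
import pathlib

# Illustrative allowlist: extension -> expected leading "magic" bytes.
# A real policy covers far more types and validates full file structure.
MAGIC_BYTES = {
    ".pdf": b"%PDF-",
    ".png": b"\x89PNG\r\n\x1a\n",
    ".docx": b"PK\x03\x04",   # OOXML documents are ZIP containers
    ".xlsx": b"PK\x03\x04",
}

def is_permitted_upload(path: str) -> bool:
    """Reject files whose extension is not allowlisted or whose content
    does not match the signature that extension implies."""
    file = pathlib.Path(path)
    ext = file.suffix.lower()
    if ext not in MAGIC_BYTES:
        return False                  # executables, scripts, unknown types
    header = file.read_bytes()[:16]
    return header.startswith(MAGIC_BYTES[ext])
```

An executable renamed to report.pdf would pass a naive extension check but fail the signature comparison, so it never reaches the scanning engines behind it.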

Require users to authenticate themselves before uploading a file. Among common (and necessarily permitted) file types such as Microsoft Office documents and PDFs, watch for threats embedded in hidden scripts and macros. Multiscanning can detect many of these, but another methodology, content disarm and reconstruction (CDR), is the most effective means of neutralising them. Check files for vulnerabilities prior to upload. Limit filename length and file size, store uploads under randomised names, and keep them somewhere other than the Web root folder, so attackers cannot access them after upload. Also, use simple error messages; do not include directory paths, server configuration settings, or other information that attackers could potentially use.
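The storage-side advice can be sketched in a few lines of Python. The upload directory, size cap, and filename limit below are assumptions for illustration; the point is that an authenticated user's upload is bounded in size and name length, stored under a random name outside the Web root, and rejected with a generic error when it breaks those limits.

```python
import pathlib
import secrets

UPLOAD_DIR = pathlib.Path("/srv/uploads")   # assumed location outside the Web root
MAX_UPLOAD_BYTES = 10 * 1024 * 1024         # illustrative 10 MB cap
MAX_FILENAME_LEN = 100                      # illustrative filename limit

def store_upload(data: bytes, original_name: str) -> str:
    """Store an authenticated user's upload under a random name,
    enforcing size and filename limits; return the stored name."""
    if len(original_name) > MAX_FILENAME_LEN or len(data) > MAX_UPLOAD_BYTES:
        # Generic error only: no paths or server details leak to the client.
        raise ValueError("Upload rejected")
    ext = pathlib.Path(original_name).suffix.lower()
    stored_name = f"{secrets.token_hex(16)}{ext}"   # random, unguessable name
    (UPLOAD_DIR / stored_name).write_bytes(data)
    return stored_name
```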

3. Cover the known and the unknown

OPSWAT research shows it takes more than 20 anti-virus and anti-malware engines to consistently reach detection rates greater than 99%. Content disarm and reconstruction (CDR) approaches the problem differently: it deconstructs each file, identifies embedded content such as macros and URLs, removes anything it considers potentially malicious, and reconstructs the file using only legitimate components. Because of its zero-trust foundation, CDR technology is highly effective at preventing unknown threats, including zero-day targeted attacks and threats that specialise in malware evasion. Because of its reconstructive approach, CDR is also effective in addressing file-based vulnerabilities and in getting rid of malicious code embedded in scripts and macros.
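As a rough illustration of the principle, the Python sketch below rebuilds a macro-enabled Word document without the archive parts that carry VBA code. It is a deliberate simplification: a genuine CDR engine parses every supported format in depth, sanitises all active content, and regenerates the file from verified components rather than simply dropping two entries.

```python
import zipfile

# Parts of a macro-enabled Word document (.docm) that carry VBA code.
MACRO_PARTS = {"word/vbaProject.bin", "word/vbaData.xml"}

def strip_macros(src_path: str, dst_path: str) -> None:
    """Rebuild an OOXML document, copying every part except the known
    macro containers, so active content never reaches the endpoint."""
    with zipfile.ZipFile(src_path) as src, \
         zipfile.ZipFile(dst_path, "w", zipfile.ZIP_DEFLATED) as dst:
        for item in src.infolist():
            if item.filename in MACRO_PARTS:
                continue                  # drop the executable content
            dst.writestr(item, src.read(item.filename))
```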

4. Redact sensitive information

The GCC is awash with regulatory requirements, from industry standards such as PCI DSS to international frameworks like GDPR to numerous local obligations such as those imposed by the UAE’s Personal Data Protection Law. Redacting sensitive information in uploaded and transmitted files mitigates the worst-case scenario: even if a file is leaked or intercepted, the most damaging data within it has already been masked.
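A minimal Python sketch of the idea is below. The two patterns (payment card numbers and email addresses) are assumptions chosen for illustration; production redaction relies on maintained detection rules, validation such as the Luhn check for card numbers, and format-aware parsing of Office and PDF files rather than plain-text substitution.

```python
import re

# Illustrative patterns only; real deployments use curated detection rules.
PATTERNS = {
    "card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
}

def redact(text: str) -> str:
    """Replace detected sensitive values with a fixed mask before the
    content is stored or transmitted onward."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[REDACTED {label.upper()}]", text)
    return text

# Prints: "Card [REDACTED CARD], contact [REDACTED EMAIL]"
print(redact("Card 4111 1111 1111 1111, contact jane.doe@example.com"))
```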

5. Enhance efficiency with ICAP

The Internet Content Adaptation Protocol (ICAP) was designed to let network devices hand content off to specialised services. ICAP enables networking appliances to offload traffic, such as HTTP requests and responses and the files they carry, to a dedicated server for data sanitisation or security scanning, ensuring all files are screened under the same policy. ICAP solutions enable network appliances to focus on their primary functions, such as network performance, while allowing the ICAP service to rapidly assess, analyse, and sanitise files passing through the system.

ICAP was created to balance the need for high performance with the need for deep content inspection. It is a lightweight, open protocol that gives security appliances a standard way to divert specific traffic to a dedicated server for deep scanning. That frees up resources on the appliance that offloaded the data, allowing a robust security posture that does not impinge on performance.
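The sketch below shows the shape of that hand-off in Python, using a raw ICAP OPTIONS request, the call a client makes to discover what a sanitisation service offers. The host name and service path are assumptions; 1344 is the protocol's default port. In practice, the proxy or appliance speaks ICAP natively and the administrator simply points it at the scanning server.

```python
import socket

ICAP_HOST = "icap.example.net"   # assumed ICAP server; 1344 is the default port
ICAP_SERVICE = "reqmod"          # assumed service path on that server

# OPTIONS asks the server which methods (REQMOD/RESPMOD), preview sizes
# and other capabilities it supports before any traffic is offloaded.
request = (
    f"OPTIONS icap://{ICAP_HOST}/{ICAP_SERVICE} ICAP/1.0\r\n"
    f"Host: {ICAP_HOST}\r\n"
    "Encapsulated: null-body=0\r\n"
    "\r\n"
)

with socket.create_connection((ICAP_HOST, 1344), timeout=5) as conn:
    conn.sendall(request.encode("ascii"))
    reply = conn.recv(4096).decode("ascii", errors="replace")

# A healthy service answers "ICAP/1.0 200 OK" plus headers such as
# Methods and Preview that tell the appliance how to offload traffic.
print(reply.splitlines()[0])
```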

Practice makes perfect

By following the practices outlined above, organisations will see a marked reduction in risk without a noticeable degradation in performance. Network security has always been complex, and there are many signs that this will remain the case for years to come as IoT, blockchain, and the Metaverse become everyday elements of the corporate technology ecosystem. But organisations that iron out the basics and establish strong processes and tool suites now will find the future a lot less daunting.