Categories
Life with GDPR

Life With GDPR: Karen Moore on the EU Corporate Sustainability Due Diligence Directive

Tom Fox and Jonathan Armstrong, renowned experts in cyber security, co-host the award-winning Life with GDPR. Jonathan is on a short hiatus, so in this episode we have a special guest, Karen Moore, who discusses the EU’s Corporate Sustainability Due Diligence Directive.

Karen Moore is a well-versed professional in the area of impact assessments and due diligence, with a particular focus on human rights and environmental issues to prevent and address potential harm. Her perspective, shaped by her extensive experience, is that impact assessments and due diligence are key indicators of a corporation’s commitment to preserving the environment and upholding human rights.

Moore emphasizes the importance of these processes not only within a company’s own activities, but also within those of its suppliers and indirect suppliers. She stresses the need for a robust due diligence process, including tracking progress, publishing annual statements, implementing complaints procedures, and involving all employees.

Additionally, she highlights the challenges of managing these processes, such as complex questionnaires for third-party suppliers and the need for streamlined assessments. She believes in a proactive approach to corporate responsibility, going beyond regulatory requirements to foster sustainable practices and ethical decision-making.

 Key Takeaways:

  • Ethical and Sustainable Business Practices Compliance Guidelines
  • Ethical Evaluation for Data Privacy Compliance in the US
  • Ethical Data Handling for GDPR Compliance
  • Ethical Business Practices in Supply Chains

 Resources:

Connect with Tom Fox

Connect with Jonathan Armstrong

Connect with Karen Moore

Categories
Blog

Insights on the EU Corporate Sustainability Due Diligence Directive from GDPR

Regarding corporate social responsibility and data protection, impact assessments and due diligence can seem like a labyrinth of legal jargon and regulatory requirements. However, understanding the importance of these processes is crucial for any corporation looking to not only comply with regulations but also build trust with customers and stakeholders. In this blog post, we will dive into the intricacies of impact assessments and due diligence, answering common questions and providing practical tips for corporations navigating the complexities of the Corporate Sustainability Due Diligence Directive (CSDDD).

We will consider the following questions:

  1. What role does GDPR compliance play in navigating the complexities of the CSDDD?
  2. Why are privacy impact assessments important for the CSDDD?
  3. How can corporations comply with the CSDDD?

In the ever-evolving landscape of corporate responsibility and ethical governance, staying ahead of regulatory directives is crucial for businesses looking to comply and positively impact society and the environment. One such directive that is making waves in the corporate world is the CSDDD. In the wake of its near full adoption by the European Council, the implications of this directive are profound, prompting organizations to rethink their approach to sustainability, human rights, and environmental impact.

The parallels between the CSDDD and the General Data Protection Regulation (GDPR) serve as a reminder of the importance of proactively addressing ethical considerations within corporate governance. Just as with the GDPR, which focuses on data privacy and protection, the CSDDD underscores the necessity of corporate diligence in ensuring environmental responsibility, human rights protection, and fair business practices.

GDPR compliance is a critical component of navigating the complexities of the CSDDD. GDPR sets strict guidelines for how companies handle the personal data of EU citizens. By ensuring compliance with GDPR regulations, corporations can demonstrate their commitment to data protection and privacy, essential for building trust with customers and stakeholders in today’s data-driven world. One of the key components of GDPR compliance is to conduct regular audits of your data processing activities to ensure compliance with GDPR requirements. Implement robust data protection measures, such as encryption and access controls, to safeguard personal data and mitigate the risk of data breaches.
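
One such safeguard, pseudonymization, can be sketched briefly. Below is a minimal illustration in Python using only the standard library; the key handling and the sample email address are hypothetical, and a production system would keep the key in a dedicated secrets store, separate from the pseudonymized data.

```python
import hmac
import hashlib

def pseudonymize(value: str, secret_key: bytes) -> str:
    """Replace an identifier with a keyed, non-reversible token.

    Uses HMAC-SHA256 so the same input always maps to the same token
    (useful for joins and audits) while the mapping cannot be reversed
    without the secret key.
    """
    return hmac.new(secret_key, value.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical example: tokenize an email address before it enters
# an analytics pipeline. The key below is illustrative only.
key = b"example-key-kept-in-a-secrets-manager"
token = pseudonymize("jane.doe@example.com", key)
print(token)  # deterministic 64-character hex token
```

Because the token is deterministic for a given key, pseudonymized records can still be joined and audited without exposing the underlying identifier.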

The essence of both GDPR and CSDDD is to take a proactive approach to compliance. By instilling a culture of responsibility within the organization, companies can effectively navigate the complexities of regulatory frameworks like the CSDDD. From conducting impact assessments to tracking progress and publishing annual statements, the directive emphasizes transparency and accountability in corporate operations.

Compliance with the CSDDD requires a proactive approach to data protection and privacy. Corporations must establish robust data governance frameworks, implement privacy-by-design principles, and regularly audit their data processing activities. By prioritizing data protection and privacy, corporations can demonstrate their commitment to responsible data management and build trust with customers and stakeholders. You should work to develop a data protection policy that outlines your organization’s commitment to data protection and privacy. Train employees on data protection best practices and provide ongoing support to ensure compliance with the CSDDD.

This is also true of privacy impact assessments (PIAs), essential for identifying and mitigating privacy risks associated with data processing activities. By conducting a PIA, corporations can assess the potential impact of their data processing activities on individuals’ privacy rights and take steps to minimize any adverse effects. PIAs are especially important in the context of the CSDDD, where data protection and privacy are paramount concerns. You should work to integrate privacy impact assessments into your data processing workflows to identify and address privacy risks proactively. Engage with data protection authorities and stakeholders to ensure transparency and accountability in your privacy practices.

While the CSDDD is a European directive, its reach extends beyond the EU’s borders, impacting US companies with significant operations or income derived from the region. This broad scope necessitates a thorough evaluation of supply chains, supplier relationships, and potential risks associated with non-compliance. The CSDDD’s requirements for due diligence and supplier engagement underscore the interconnected nature of global business operations.

As organizations strive to align with the CSDDD, integrating existing laws and guidelines from related legislation, such as GDPR, becomes essential. From incorporating OECD guidelines to addressing human rights and environmental impact, companies must adopt a comprehensive approach to compliance. By leveraging technological solutions and strategic staffing, businesses can streamline their compliance efforts and enhance their impact on society and the environment.

The convergence of directives like the CSDDD and GDPR heralds a new era of ethical governance for businesses worldwide. By embracing the principles of sustainability, human rights protection, and environmental stewardship, organizations can meet regulatory requirements and contribute to a more responsible and equitable corporate landscape. As we navigate the complexities of corporate responsibility, let us heed the lessons from these directives and strive to do the right thing, both ethically and legally.

Navigating the complexities of impact assessments and due diligence in the context of the CSDDD may seem daunting. Still, with a proactive approach to data protection and privacy, corporations can demonstrate their commitment to responsible data management and build trust with customers and stakeholders. By prioritizing GDPR compliance, conducting privacy impact assessments, and implementing robust data protection measures, corporations can navigate the complexities of the CSDDD effectively.

Categories
TechLaw10

TechLaw10: Eric Sinrod & Jonathan Armstrong on 5 years of GDPR

In this edition of TechLaw10, Jonathan Armstrong talks to Attorney and Professor Eric Sinrod from his home in California. They discuss the fifth anniversary of GDPR coming into force.

The topics include:

  • What are the fine levels under GDPR?
  • The use of Data Protection Impact Assessments
  • Cookies
  • AI
  • Data Transfer
  • The rise of GDPR-like legislation around the world
  • The future of GDPR

You can listen to earlier TechLaw10 audio podcasts with Eric and Jonathan at https://www.duanemorris.com/site/techlaw10.html

Connect with the Compliance Podcast Network at:

LinkedIn: https://www.linkedin.com/company/compliance-podcast-network/
Facebook: https://www.facebook.com/compliancepodcastnetwork/
YouTube: https://www.youtube.com/@CompliancePodcastNetwork
Twitter: https://twitter.com/tfoxlaw
Instagram: https://www.instagram.com/voiceofcompliance/
Website: https://compliancepodcastnetwork.net/

Categories
Blog

SolarWinds Under GDPR: Corporate Responsibility and Risks in Data Protection

The General Data Protection Regulation (GDPR) has significantly changed how organizations handle data protection and privacy. It emphasizes the importance of transparency and honesty in disclosing data breaches and vulnerabilities. In a recent episode of the podcast Life with GDPR, Tom Fox and Jonathan Armstrong from Cordery Compliance discussed the topic of corporate responsibility and risks in data protection, with a particular focus on the SolarWinds case.

To recap, in late 2023, the SEC filed a lawsuit against SolarWinds Corp and its CISO, Tim Brown, following the 2020 data breach, bringing the issue of executive liability in cybersecurity disclosures to the forefront. The lawsuit raised important questions about the personal liability of senior executives for inaccurate risk disclosures and has potential implications for other industries under US securities law.

The 2020 breach, orchestrated by Russian hackers, targeted SolarWinds’ software, Orion, and exposed highly sensitive information. The hackers gained access to SolarWinds and planted spyware in the Orion program. SolarWinds then distributed an update to its corporate customers, unknowingly spreading the Russian spyware. This allowed the hackers to access the highest levels of the US government and major corporations.

The SEC’s lawsuit against SolarWinds and Tim Brown focused on the poor disclosures about the company’s information security throughout 2018, 2019, and 2020. While SolarWinds publicly claimed to have good cybersecurity, internal communications revealed that employees were aware of the company’s cybersecurity issues and considered them a mess. This discrepancy between internal knowledge and external disclosures formed the basis of the SEC’s allegations.

The SEC complaint alleged that SolarWinds’ public statements about its cybersecurity practices and risks were at odds with its internal assessments. These included a 2018 presentation, prepared by a company engineer and shared internally (including with Brown), stating that SolarWinds’ remote access set-up was “not very secure” and that someone exploiting the vulnerability “can do whatever without us detecting it until it’s too late,” which could lead to “major reputation and financial loss” for SolarWinds. Similarly, as alleged in the complaint, 2018 and 2019 presentations by Brown stated, respectively, that the “current state of security leaves us in a very vulnerable state for our critical assets” and that “[a]ccess and privilege to critical systems/data is inappropriate.”

Beyond this SEC enforcement action, there were other implications as well. One key takeaway from the episode is the pressure on corporate leaders, including CISOs, Data Protection Officers, and Compliance Officers, to disclose data breaches promptly. While GDPR offers some protection to Data Protection Officers, they are not entirely exempt from liabilities. The SolarWinds case serves as a reminder of the need for specific and timely disclosure of breaches and the importance of addressing system vulnerabilities.

The risks associated with data breaches are not limited to regulatory fines. Litigation risks are a significant concern for organizations, with shareholders and whistleblowers potentially seeking legal action. The episode highlights the importance of transparency and not misrepresenting information to regulators. Misrepresentations can lead to severe consequences for individuals in positions of responsibility within corporations.

Budget constraints can also hinder the timely fixing of vulnerabilities, ultimately leading to breaches. Organizations need to take proactive measures to identify and address vulnerabilities promptly. Realistic resource assessments are crucial to ensuring that adequate resources are allocated to data protection efforts. Additionally, having adequate insurance protection, such as Directors and Officers (D&O) insurance, can help protect individuals in positions of responsibility from potential liabilities.

The episode also emphasizes the need for organizations to consider the impact on their stock exchange filings when deciding whether to disclose a data breach. The decision whether to disclose a breach in such filings can be challenging and depends on factors such as materiality. Organizations need to assign a dedicated team to consider these factors, particularly when engaged in transactions like mergers and acquisitions or fundraising.

Transparency and honesty are key principles in data protection and privacy. Audit reports and investigation findings must be acted upon promptly to address vulnerabilities. Emails and other forms of communication can serve as evidence in legal proceedings, highlighting the importance of careful communication within organizations.

The potential for litigation is significant in data breach cases. Shareholders may seek legal action if they believe the value of their stock has been affected. Whistleblowers, incentivized by various jurisdictions, may also come forward with information. This highlights the need for organizations to maintain a culture of transparency and integrity and for individuals to review their remuneration packages to avoid conflicts of interest.

In conclusion, GDPR, corporate responsibility, and risks in data protection are interconnected. Organizations must prioritize transparency, honesty, and timely disclosure of breaches and vulnerabilities. Proactive measures, realistic resource assessments, and adequate insurance protection are crucial to mitigating risks. By considering the impact on stock exchange filings and maintaining a culture of integrity, organizations can navigate the challenges associated with data protection and privacy in the GDPR era.

Categories
Life with GDPR

Life With GDPR: Critical Perspectives on Big Law Firm Cybersecurity

Tom Fox and Jonathan Armstrong, renowned experts in cyber security, co-host the award-winning Life with GDPR. In this episode, they look at a data breach at a large law firm.

In the wake of a recent spearphishing attack and data breach at a UK law firm, the legal community is abuzz with discussions on the responsibility of lawyers to prevent such attacks. Tom Fox, known for his critical perspective on big law firms, highlights the mistakes made by the firm in question, emphasizing the increasing concern over cyber-attacks targeting law firms and the need for timely reporting to regulatory authorities. Jonathan Armstrong, on the other hand, underscores the importance of proactive cybersecurity measures and timely reporting, commending the firm for taking immediate action but criticizing the delay in reporting the breach. Both Fox and Armstrong bring their unique perspectives shaped by their experiences in the field. Join them on this episode of the Life with GDPR podcast as they delve deeper into this topic.

Key Takeaways:

  • A Spearphishing Attack Leads to a Data Breach
  • Cybersecurity Measures for Law Firms
  • The Power of Dedicated Data Protection Training

 Resources:

For more information on the issues raised in this podcast, check out the Cordery Compliance News Section. For more information on Cordery Compliance, go to their website here.

Also, check out the GDPR Navigator, one of the top resources for GDPR compliance, by clicking here. Check out the Cordery Data Breach Academy here.

Connect with Tom Fox

●      LinkedIn

Connect with Jonathan Armstrong

●      Twitter

●      LinkedIn

Categories
31 Days to More Effective Compliance Programs

One Month to a More Effective Compliance Program Through Data Analytics: Day 10 – The Impact of Privacy Regulations on Compliance

What is the impact of privacy regulations on data-driven compliance? Every CCO must be aware of the importance of privacy in data-driven compliance and of the challenges and tradeoffs involved in implementing effective compliance strategies. A key mandate is for CCOs and compliance professionals to have a compliance program that provides visibility into their data. This underscores the importance of having efficient and effective compliance solutions in place; as I have previously noted, CCOs must have access to their compliance data literally at their fingertips.

This is one of the drivers for key trends shaping compliance technology in 2025 and beyond. The RegTech market is growing rapidly, and there is increased regulatory focus on cryptocurrency activities, ESG, and information security and cybersecurity. These trends indicate the evolving landscape of compliance and the need for organizations to stay updated and adapt their compliance strategies accordingly. By embracing connected compliance and leveraging technology, organizations can navigate the complex regulatory landscape and ensure compliance with privacy regulations while driving business efficiency.

 Three key takeaways:

  1. CCOs and compliance professionals must have a compliance program that provides visibility into their data.
  2. Privacy regulations affect not only regulated industries but also any company holding private customer data or involved in large supply chains.
  3. By embracing connected compliance and leveraging technology, organizations can navigate the complex regulatory landscape and ensure compliance with privacy regulations while driving business efficiency.

For more on KonaAI, click here.

Categories
Life with GDPR

Life With GDPR – Lessons Learned from The Singtel Optus Data Breach

Tom Fox and Jonathan Armstrong, renowned experts in cyber security, co-host the award-winning Life with GDPR. In this episode, they look at litigation against Singtel Optus in Australia over a data breach and the fallout from an investigation report.

The recent data breach at Singtel Optus, affecting 1.2 million individuals, has brought to light the critical role of strategic communication in managing cybersecurity breaches. Tom Fox and Jonathan Armstrong offer their unique perspectives on this issue. Fox emphasizes the inevitability of cybersecurity breaches and the need for a comprehensive strategy, including effective communication, to manage them. He warns against the potential consequences of mishandling communication during a breach, such as jeopardizing insurance coverage.

Armstrong highlights the complexity of maintaining privilege in a global corporate structure and the importance of careful language to avoid invalidating insurance or causing unnecessary speculation. He also underscores the need for a holistic approach to cybersecurity, encompassing prevention, detection, remediation, and crisis communication. Join Tom Fox and Jonathan Armstrong as they delve deeper into this topic in the latest Life with GDPR podcast episode.

Key Takeaways:

  • Implications of Language in Data Breach Reporting
  • Navigating CEO Communication and Insurance Coverage
  • Navigating Insurance Coverage in Data Breaches

 Resources:

For more information on the issues raised in this podcast, check out the Cordery Compliance News Section. For more information on Cordery Compliance, go to their website here. Also, check out the GDPR Navigator, one of the top resources for GDPR compliance, by clicking here. Check out the Cordery Data Breach Academy here.

Connect with Tom Fox:

Connect with Jonathan Armstrong:

●   Twitter

●   LinkedIn

Categories
Life with GDPR

Life With GDPR: WhatsApp Breach: Hospital’s GDPR Failures Exposed

Tom Fox and Jonathan Armstrong, renowned experts in cyber security, co-host the award-winning Life with GDPR. In this episode, Tom and Jonathan discuss a data breach at a Scottish hospital during the COVID-19 pandemic.

The breach occurred when hospital staff shared patient details on WhatsApp, raising concerns about GDPR compliance. The hospital informed the ICO about the breach but chose not to notify affected patients, highlighting the need for appropriate advice and support when making such decisions. The conversation also explores communication challenges in internal investigations and the privacy and security risks of platforms like WhatsApp. It emphasizes the importance of organizations adapting to the preferences of digital native employees and conducting data protection impact assessments. The podcast also highlights the importance of effective policies, training, and proactive phishing training to prevent cyber-attacks and protect sensitive information.
Key Takeaways:

  • Data breach in Scottish hospital
  • The Challenges of Communication in Internal Investigations
  • Importance of Policies and Training
  • Phishing Training Effectiveness

Resources

For more information on the issues raised in this podcast, check out the Cordery Compliance News Section. For more information on Cordery Compliance, go to their website here. Also, check out the GDPR Navigator, one of the top resources for GDPR Compliance, by clicking here.

Connect with Tom Fox

Connect with Jonathan Armstrong

Categories
Blog

AI and GDPR

Artificial Intelligence (AI) has revolutionized various industries, but with great power comes great responsibility. Regulators in the European Union (EU) are taking a proactive approach to address compliance and data protection issues surrounding AI and generative AI. Recent cases, such as Google’s AI tool, Bard, being temporarily suspended in the EU, have highlighted the urgent need for regulation in this rapidly evolving field. I recently had the opportunity to visit with GDPR maven Jonathan Armstrong on this topic. In this blog post, we will delve into our conversations about some of the key concerns raised about data and privacy in generative AI, the importance of transparency and consent, and the potential legal and financial implications for organizations that fail to address these concerns.

One of the key issues in the AI landscape is obtaining informed consent from users. The recent scrutiny faced by video conferencing platform Zoom serves as a stark reminder of the importance of transparency and consent practices. While there has been no official investigation into Zoom’s compliance with informed consent requirements, the company has retracted its initial statements and is likely considering how to obtain consent from users.

It is essential to recognize that obtaining consent extends not only to those who host a Zoom call but also to those who are invited to join the call. Unfortunately, there has been no on-screen warning about consent when using Zoom, leaving users in the dark about the data practices involved. This lack of transparency can lead to significant legal and financial penalties, as over 70% of GDPR fines involve a lack of transparency by the data controller.

Generative AI heavily relies on large pools of data for training, which raises concerns about copyright infringement and the processing of individuals’ data without consent. For instance, Zoom’s plan to use recorded Zoom calls to train AI tools may violate GDPR’s requirement of informed consent. Similarly, Getty Images has expressed concerns about its copyrighted images being used without consent to train AI models.

Websites often explicitly prohibit scraping data for training AI models, emphasizing the need for organizations to respect copyright laws and privacy regulations. Regulators are rightfully concerned about AI processing individuals’ data without consent or knowledge, as well as the potential for inaccurate data processing. Accuracy is a key principle of GDPR, and organizations using AI must conduct thorough data protection impact assessments to ensure compliance.
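
One practical, if partial, expression of that respect is honoring a site’s robots.txt rules before collecting any training data. Below is a minimal sketch in Python using the standard library’s `urllib.robotparser`; the bot names, site, and rules are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for an example site that bars an
# AI-training crawler from its image directories.
robots_txt = """\
User-agent: ExampleAIBot
Disallow: /images/

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A compliant crawler consults the rules before fetching anything.
print(parser.can_fetch("ExampleAIBot", "https://example.com/images/photo1.jpg"))  # False
print(parser.can_fetch("SomeOtherBot", "https://example.com/images/photo1.jpg"))  # True
```

Note that robots.txt is a voluntary convention, not an access control, so honoring it complements rather than replaces copyright and data protection compliance.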

Several recent cases demonstrate the regulatory focus on AI compliance and transparency. In Italy, rideshare and food delivery applications faced investigations and suspensions over their AI practices. Spain has examined the use of AI in recruitment processes, highlighting the importance of transparency in the selection process. Google’s Bard, like the Facebook dating case before it, was temporarily suspended in the EU due to the lack of a mandatory data protection impact assessment (DPIA).

It is concerning that many big tech providers fail to engage with regulators or produce the required DPIA for their AI applications. This lack of compliance and transparency poses significant risks for organizations, not just in terms of financial penalties but also potential litigation risks in the hiring process.

To navigate the compliance and data protection challenges posed by AI, organizations must prioritize transparency, fairness, and lawful processing of data. Conducting a data protection impact assessment is crucial, especially when AI is used in Know Your Customer (KYC), due diligence, and job application processes. If risks cannot be resolved or remediated internally, it is advisable to consult regulators and include timings for such consultations in project timelines.

For individuals, it is essential to be aware of the terms and conditions associated with AI applications. In the United States, informed consent is often buried within lengthy terms and conditions, leading to a lack of understanding and awareness. By being vigilant and informed, individuals can better protect their privacy and data rights.

As AI continues to transform industries, compliance and data protection must remain at the forefront of technological advancements. Regulators in the EU are actively addressing the challenges posed by AI and generative AI, emphasizing the need for transparency, consent, and compliance with GDPR obligations. Organizations and individuals must prioritize data protection impact assessments, engage with regulators when necessary, and stay informed about the terms and conditions associated with AI applications. By doing so, we can harness the power of AI while safeguarding our privacy and ensuring ethical practices in this rapidly evolving field.

Categories
Blog

The Importance of Effective Policies and Training in Data Protection: Lessons from a Scottish Hospital Breach

I recently had the chance to visit with Jonathan Armstrong about a recent data breach at the health service provider NHS Lanarkshire (Scotland) during the COVID-19 pandemic. This breach serves as a stark reminder of the challenges organizations face in maintaining data protection and compliance, especially when it comes to communication platforms like WhatsApp. In this blog post, we will explore the lessons learned from this incident and discuss practical advice for organizations to ensure robust data protection measures.

Background

According to the Cordery Compliance Client Alert on the matter, over a two-year period between 2020 and 2022, 26 staff at NHS Lanarkshire had access to a WhatsApp group in which there were a minimum of 533 entries that included patient names. Of these, 215 included phone numbers, 96 included dates of birth, and 28 included addresses. Fifteen images, 3 videos, and 4 screenshots were also shared, which included patients’ personal data and clinical information, “special category” health data under both EU and UK law. Other data was also added to the WhatsApp group in error, and other communications were identified in which the staff in question had used WhatsApp.

WhatsApp was not approved by NHS Lanarkshire for processing patients’ personal data; its use was an approach apparently adopted by staff without organizational knowledge. It served as a substitute for communications that would previously have taken place in the clinical office, after staff reduced office attendance due to the COVID-19 pandemic. No Data Protection Impact Assessment was in place, and no risk assessment relating to personal data processing was completed for WhatsApp. NHS Lanarkshire undertook an internal investigation and reported the matter to the ICO.

ICO Holding

The UK ICO determined that NHS Lanarkshire did not have appropriate policies, clear guidance, and processes in place when WhatsApp was made available to download. Additionally, there were a number of infringements of UK GDPR, not least the failure to implement appropriate technical and organizational measures (TOMs) to ensure the security of the personal data involved; as a consequence, personal data was shared via unauthorized means and an inappropriate disclosure occurred. There was also a failure to report the matter to the ICO as a data breach in time.

Armstrong noted that ICO recommended that NHS Lanarkshire should take action to ensure their compliance with data protection law, including:

  1. Considering implementing a secure clinical image transfer system, as part of NHS Lanarkshire’s exploration regarding the storage of images and videos within a care setting;
  2. Before deploying new apps, consideration of the risks relating to personal data and including the requirement to assess and mitigate these risks in any approval process;
  3. Ensuring that explicit communications, instructions or guidance are issued to employees on their data protection responsibilities when new apps are deployed;
  4. Reviewing all organizational policies and procedures relevant to this matter and amending them where appropriate; and,
  5. Ensuring that all staff are aware of their responsibilities to report personal data breaches internally without delay to the relevant team.

Armstrong concluded that “In light of the remedial steps and mitigating factors the ICO issued an official reprimand – a fine has not yet been imposed. The ICO also asked NHS Lanarkshire to provide an update of actions taken within six months of the reprimand being issued.”

Discussion

This case highlights the challenges organizations face when it comes to communication during internal investigations. In many instances, the most interesting documents are not found in emails, as one organization discovered. Employees often turn to alternative platforms like WhatsApp to avoid leaving a paper trail. However, it is crucial to understand that these platforms may not provide the expected privacy and security.

While platforms like WhatsApp may seem secure, they still share data with big tech companies, raising concerns about privacy. Organizations must adapt to the preferences of digital-native employees who may find email restrictive and opt for alternative communication methods. However, this adaptation should be done consciously, ensuring that policies and procedures are in place to protect sensitive information. Armstrong emphasizes the importance of revisiting emergency measures implemented during the pandemic. As remote work continues, organizations must conduct thorough data protection impact assessments to ensure compliance across all communication platforms and measures.

As with all types of compliance, setting policies and procedures is just the first step. It is essential to communicate and educate employees on these policies to ensure their understanding and compliance. Annual online training sessions are not enough; organizations should provide engaging training that goes beyond passive learning, supported by ongoing communication with employees. Armstrong also remarked on the ineffectiveness of off-the-shelf online phishing training. Waiting for an incident to occur and then providing training is not enough to prevent people from clicking on malicious links. Organizations should focus on providing better training before incidents happen, rather than trying to enhance it afterwards.

The next step is monitoring: compliance with policies and procedures should be actively tracked. Technical solutions are available to help companies monitor compliance, but it is crucial to involve individuals at all levels of the organization when designing these policies. Additionally, a balanced approach is needed, in which employees are recognized for their service but also held accountable for policy breaches. The days of relying solely on punishment for enforcement are gone.

The data breach in the Scottish hospital serves as a wake-up call for organizations to prioritize data protection and compliance. Communication challenges during internal investigations, privacy concerns associated with alternative platforms, and the need for effective policies and training are crucial areas to address. By conducting regular data protection impact assessments, providing engaging training, and ensuring buy-in from employees, organizations can strengthen their defense against cyber threats and protect sensitive information. Always remember that compliance is an ongoing process, and continuous evaluation and improvement are necessary to adapt to the evolving digital landscape. Finally, stay vigilant and proactive in safeguarding data privacy and protection.