Severity
High
Analysis Summary
Microsoft has acknowledged a significant data exposure in which 38 terabytes of private data were discovered via the company’s AI GitHub repository. The leak occurred when a storage container of open-source training data was inadvertently made public. It included sensitive information such as secrets, keys, passwords, and over 30,000 internal Microsoft Teams messages, originating from the workstations of two former employees.
The repository, known as “robust-models-transfer,” is currently inaccessible. Before it was removed, it contained source code and machine learning models related to a research paper from 2020 titled “Do Adversarially Robust ImageNet Models Transfer Better?”
Microsoft traced the leak to an overly permissive Shared Access Signature (SAS) token, which granted complete control over the shared files. This Azure feature is notoriously difficult to monitor and revoke.
The problem was exacerbated by the repository’s README.md file, which instructed developers to download the models from an Azure Storage URL that accidentally granted access to the entire storage account, exposing additional private data. The SAS token was also misconfigured to allow “full control” permissions, enabling attackers to not only view but also delete and overwrite files in the storage account.
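To illustrate, a SAS token’s scope and lifetime can be checked directly from the URL’s query string: the `sp` parameter carries the granted permissions and `se` carries the expiry time. The following is a minimal, hypothetical auditing sketch using only the Python standard library; the one-hour threshold mirrors the Azure Storage guidance cited later in this advisory.

```python
from datetime import datetime, timedelta, timezone
from urllib.parse import urlparse, parse_qs

# Permissions beyond read/list that make a SAS token risky to share:
# write, delete, add, create, update.
WRITE_PERMS = set("wdacu")

def audit_sas_url(sas_url: str, max_lifetime: timedelta = timedelta(hours=1)):
    """Flag overly permissive or long-lived SAS URLs.

    Inspects the standard SAS query parameters:
      sp = granted permissions, se = expiry time (ISO 8601, UTC).
    Returns a list of human-readable findings (empty list = no issues).
    """
    params = parse_qs(urlparse(sas_url).query)
    findings = []

    perms = set(params.get("sp", [""])[0])
    risky = perms & WRITE_PERMS
    if risky:
        findings.append(f"token grants write-level permissions: {sorted(risky)}")

    expiry_raw = params.get("se", [None])[0]
    if expiry_raw is None:
        findings.append("token has no 'se' expiry parameter")
    else:
        expiry = datetime.fromisoformat(expiry_raw.replace("Z", "+00:00"))
        if expiry - datetime.now(timezone.utc) > max_lifetime:
            findings.append(
                f"token exceeds the recommended maximum lifetime of {max_lifetime}"
            )

    return findings
```

Run against a URL like the one in the leaked README (hypothetical values), a token with `sp=racwdl` and a far-future `se` would produce two findings: full-control permissions and an excessive lifetime.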
Microsoft’s investigation found no evidence of unauthorized exposure of customer data, and no other internal services were compromised. The company revoked the SAS token, blocked external access to the storage account, and resolved the issue within two days of responsible disclosure.
To prevent similar incidents in the future, Microsoft has expanded its secret scanning service to include overly permissive SAS tokens and identified a bug in its scanning system that flagged the SAS URL in the repository as a false positive.
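A secret-scanning pass of the kind Microsoft describes can be approximated by pattern-matching for SAS-style URLs, which are recognizable by an Azure Storage hostname followed by a `sig=` signature parameter. This is a simplified, hypothetical sketch, not Microsoft’s actual scanner:

```python
import re

# A SAS URL is recognizable by an Azure Storage hostname plus a query
# string carrying a 'sig=' signature parameter.
SAS_URL_PATTERN = re.compile(
    r"https://[a-z0-9]+\.(?:blob|file|queue|table|dfs)\.core\.windows\.net"
    r"/\S*\?\S*sig=[A-Za-z0-9%+/=]+"
)

def find_sas_urls(text: str) -> list[str]:
    """Return every candidate SAS URL embedded in a blob of text,
    e.g. a README.md or a source file pulled from a repository."""
    return SAS_URL_PATTERN.findall(text)
```

Any hit would then be checked for over-broad permissions and expiry, as in the audit step above; a real scanner would also walk commit history, since a token removed from the current tree can survive in earlier revisions.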
“Due to the lack of security and governance over Account SAS tokens, they should be considered as sensitive as the account key itself. Therefore, it is highly recommended to avoid using Account SAS for external sharing. Token creation mistakes can easily go unnoticed and expose sensitive data,” the researchers noted.
This incident highlights the need for enhanced security measures as data scientists and engineers handle vast amounts of data for AI development. It also underscores the challenges of securing such data in collaborative and open-source environments.
The researchers reported the issue to the Microsoft Security Response Center (MSRC) on June 22nd, which led to the revocation of the SAS token, blocking all external access to the Azure storage account. They stated that no customer data was leaked and that no internal services were disrupted by the incident.
“We appreciate the opportunity to investigate the findings reported by Wiz.io. We encourage all researchers to work with vendors under Coordinated Vulnerability Disclosure (CVD) and abide by the rules of engagement for penetration testing to avoid impacting customer data while conducting security research,” Microsoft concluded.
Impact
- Sensitive Data Theft
- Credential Theft
Remediation
- Immediately revoke the compromised SAS token to block unauthorized access to the affected storage resources.
- Conduct a thorough investigation to understand the extent of the data exposure, including what data was accessed and potentially compromised.
- Perform a comprehensive security audit of the Azure environment to identify vulnerabilities and misconfigurations that led to the breach.
- Review and assess the configuration settings of all SAS tokens to ensure they adhere to best practices and the principle of least privilege.
- Treat SAS tokens as sensitive secrets and implement secure secret management practices, such as using a dedicated secrets management solution or service.
- Implement rigorous access controls and access management policies for SAS tokens, ensuring that they are only shared with authorized clients or applications.
- Schedule periodic audits of SAS tokens and their configurations to ensure ongoing compliance with security best practices.
- Provide training and awareness programs to educate employees and developers about the importance of secure SAS token management and the risks associated with misconfigurations.
- Implement automated scanning tools that can identify SAS tokens with overly permissive settings or privileges. Set up alerts to notify administrators when such issues are detected.
- Maintain a well-defined incident response plan that outlines the steps to take in the event of a security incident. Ensure that the plan is regularly tested and updated.
Additionally, Azure Storage provides the following recommendations for working with Shared Access Signature (SAS) URLs:
- Limit the scope of SAS URLs to the smallest set of resources required by clients, such as a single blob. Also, restrict permissions to only those needed by the application.
- Always set a near-term expiration time when creating a SAS. Azure Storage recommends an expiration time of 1 hour or less for all SAS URLs.
- Treat SAS URLs as valuable application secrets. Only share SAS URLs with clients who genuinely require access to the associated storage account. Minimize exposure to reduce security risks.
- Associate SAS tokens with a Stored Access Policy when applicable. This enables fine-grained revocation of a SAS within a container. Be prepared to remove the Stored Access Policy or rotate storage account keys if a SAS or Shared Key becomes compromised to prevent unauthorized access.
- Implement monitoring and auditing for your application’s interactions with the storage account. Enable Azure Monitor and Azure Storage Logs to track how requests are authorized. Utilize a SAS Expiration Policy to identify clients using long-lived SAS URLs, which can help in detecting potential security risks or breaches.
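The first two recommendations, scoping a SAS to a single blob and giving it a near-term expiry, can be sketched as follows. This is a simplified illustration of the documented service-SAS signing scheme (HMAC-SHA256 over a newline-joined string-to-sign, keyed with the base64-decoded account key); the exact string-to-sign field order varies by service version, so treat this as an assumption-laden sketch. Real applications should call `generate_blob_sas` from the `azure-storage-blob` SDK rather than hand-rolling the signature.

```python
import base64
import hashlib
import hmac
from datetime import datetime, timedelta, timezone
from urllib.parse import urlencode

def make_blob_sas_url(account: str, container: str, blob: str,
                      account_key_b64: str,
                      permissions: str = "r",
                      lifetime: timedelta = timedelta(hours=1)) -> str:
    """Build a read-only, short-lived service SAS URL for a single blob.

    Illustrative only: field order below reflects the published
    string-to-sign for recent service versions and is not guaranteed
    to match every version. Use azure.storage.blob.generate_blob_sas
    in production code.
    """
    version = "2020-12-06"
    now = datetime.now(timezone.utc)
    start = now.strftime("%Y-%m-%dT%H:%M:%SZ")
    expiry = (now + lifetime).strftime("%Y-%m-%dT%H:%M:%SZ")
    resource = f"/blob/{account}/{container}/{blob}"

    # String-to-sign fields: permissions, start, expiry, canonicalized
    # resource, stored-access-policy identifier, IP range, protocol,
    # version, resource type ('b' = single blob), snapshot, encryption
    # scope, and five response-header overrides (all blank here).
    string_to_sign = "\n".join([
        permissions, start, expiry, resource,
        "", "", "https", version, "b", "", "",
        "", "", "", "", "",
    ])
    signature = base64.b64encode(
        hmac.new(base64.b64decode(account_key_b64),
                 string_to_sign.encode("utf-8"),
                 hashlib.sha256).digest()
    ).decode()

    query = urlencode({
        "sv": version, "spr": "https", "st": start, "se": expiry,
        "sr": "b", "sp": permissions, "sig": signature,
    })
    return f"https://{account}.blob.core.windows.net/{container}/{blob}?{query}"
```

Note the design choice embodied here: the token names one blob (`sr=b`), grants read only (`sp=r`), and expires within the hour — so even if the URL leaks into a README, the blast radius is a single file for a short window, not an entire storage account under full control.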