Exposed long-term credentials continue to be the top entry point used by threat actors in security incidents observed by the AWS Customer Incident Response Team (CIRT). The exposure and subsequent use of long-term credentials or access keys by threat actors pose security risks in cloud environments. Additionally, poor key rotation practices, sharing of access keys […]
If you’ve ever spent hours manually digging through AWS CloudTrail logs, checking AWS Identity and Access Management (IAM) permissions, and piecing together the timeline of a security event, you understand the time investment required for incident investigation. Today, we’re excited to announce the addition of AI-powered investigation capabilities to AWS Security Incident Response that automate […]
Bulletin ID: AWS-2025-029
Scope: AWS
Content Type: Important (requires attention)
Publication Date: 2025/11/21 12:15 PM PDT
Description:
AWS Wickr is an end-to-end encrypted service that helps organizations communicate securely through messaging, voice and video calling, file sharing, and screen sharing.
We identified CVE-2025-13524, which describes an issue in the Wickr calling service. Under certain conditions, which require the affected user to take a particular action within the application, the user’s audio stream remains open after they close their call window. This could result in audio from the affected user’s device continuing to stream unexpectedly to other call participants until those users drop the call, the affected user joins another call, or the affected user terminates their application.
Impacted versions:
AWS Wickr, Wickr Gov and Wickr Enterprise desktop (Windows, Mac and Linux) versions prior to 6.62.13.
Modern generative AI applications often need to stream large language model (LLM) outputs to users in real time. Instead of waiting for a complete response, streaming delivers partial results as they become available, which significantly improves the user experience for chat interfaces and long-running AI tasks. This post compares three serverless approaches for handling Amazon Bedrock LLM streaming on Amazon Web Services (AWS), helping you choose the best fit for your application.
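To make the streaming idea concrete independent of any particular AWS SDK call, here is a minimal local sketch of the pattern: the producer yields partial chunks as they become available instead of buffering the full response, and the consumer renders each chunk as it arrives. The function names and chunk size are illustrative, not part of any AWS API.

```python
def stream_tokens(full_response: str, chunk_size: int = 8):
    """Producer side: yield partial results as they become
    available, rather than returning the whole response at once."""
    for i in range(0, len(full_response), chunk_size):
        yield full_response[i:i + chunk_size]  # one partial chunk, like one SSE event

def render_stream(chunks):
    """Consumer side: append each chunk to the output as it arrives,
    so the user sees text before generation finishes."""
    rendered = ""
    for chunk in chunks:
        rendered += chunk
    return rendered

answer = "Streaming delivers partial results as they become available."
assert render_stream(stream_tokens(answer)) == answer
```

In the real serverless variants, the producer role is played by the model's streaming response and the consumer is the client connection; the pattern of forwarding chunks as they arrive is the same.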
Today, we announced Amazon API Gateway REST API’s support for private integration with Application Load Balancers (ALBs). You can use this new capability to securely expose your VPC-based applications through your REST APIs without exposing your ALBs to the public internet.
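At a high level, a private integration is wired up by creating a VPC link that targets the load balancer and then pointing the REST API method integration at that link. As a sketch with the AWS CLI, where the ARN, API and resource IDs, VPC link ID, and URI are all placeholders:

```shell
# Create a VPC link targeting the Application Load Balancer
# (the ALB ARN below is a placeholder)
aws apigateway create-vpc-link \
  --name my-alb-vpc-link \
  --target-arns arn:aws:elasticloadbalancing:us-east-1:123456789012:loadbalancer/app/my-alb/50dc6c495c0c9188

# Route a REST API method through the VPC link
# (API ID, resource ID, connection ID, and URI are placeholders)
aws apigateway put-integration \
  --rest-api-id abc123 \
  --resource-id xyz789 \
  --http-method GET \
  --type HTTP_PROXY \
  --integration-http-method GET \
  --connection-type VPC_LINK \
  --connection-id vpclink123 \
  --uri http://internal-alb.example.com/resource
```

Because the integration goes through the VPC link, the ALB itself never needs a public listener.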
Event-driven applications often need to process data in real time. When you use AWS Lambda to process records from Apache Kafka topics, you frequently encounter two typical requirements: you need to process very high volumes of records in near real time, and you want your consumers to scale rapidly to handle traffic spikes. Achieving both requires understanding how Lambda consumes Kafka streams, where the potential bottlenecks are, and how to optimize configurations for high throughput and best performance.
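To make the consumption model concrete, here is a minimal sketch of a Lambda handler behind a Kafka event source mapping. Lambda delivers records in batches, base64-encoded and grouped under topic-partition keys; the topic name and payload below are illustrative.

```python
import base64
import json

def handler(event, context):
    """Process one batch of Kafka records delivered by the Lambda
    event source mapping. Records arrive base64-encoded, grouped
    under topic-partition keys such as "orders-0"."""
    processed = 0
    for topic_partition, records in event["records"].items():
        for record in records:
            payload = base64.b64decode(record["value"]).decode("utf-8")
            # Application logic goes here; keeping per-record work fast
            # lets the batch complete quickly so the consumer keeps up.
            json.loads(payload)
            processed += 1
    return {"batchSize": processed}

# Illustrative invocation with a hand-built event
sample_event = {
    "eventSource": "aws:kafka",
    "records": {
        "orders-0": [
            {"topic": "orders", "partition": 0, "offset": 15,
             "value": base64.b64encode(b'{"orderId": 1}').decode("utf-8")}
        ]
    },
}
```

Throughput tuning then comes down to how large these batches are and how many of them Lambda processes in parallel across partitions.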
In this post, you will learn how Amazon API Gateway's new enhanced TLS security policies help you meet standards such as PCI DSS, Open Banking, and FIPS, while strengthening how your APIs handle TLS negotiation. This new capability strengthens your security posture without adding operational complexity, and provides you with a single, consistent way to standardize TLS configuration across your API Gateway infrastructure.
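Applying a TLS security policy to a custom domain name can be sketched with the AWS CLI. The domain name is a placeholder, and the available policy values for the enhanced policies should be checked against the API Gateway documentation; `TLS_1_2` is shown here only as an example value.

```shell
# Update the TLS security policy on an existing custom domain
# (domain name and policy value are placeholders)
aws apigateway update-domain-name \
  --domain-name api.example.com \
  --patch-operations op=replace,path=/securityPolicy,value=TLS_1_2
```

Because the policy is set on the domain name, every API served through that domain inherits the same TLS configuration, which is what makes the setting a single, consistent control point.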