AWS Updates Feed

AWS Updates - 2026-02-05

AWS What's New

AWS Builder ID now supports Sign in with Apple

AWS Builder ID, your profile for accessing AWS applications including AWS Builder Center, AWS Training and Certification, AWS re:Post, AWS Startups, and Kiro, now supports Sign in with Apple as a social login provider. This expansion of sign-in options builds on the existing Sign in with Google capability, providing Apple users with a streamlined way to access AWS resources without managing separate credentials on AWS.

With Sign in with Apple integration, developers and builders can access their AWS Builder ID profile using their Apple Account credentials. This enhancement eliminates password management complexity, reduces forgotten-password issues, and provides a frictionless experience for both new user registration and returning user sign-ins. Whether you're accessing development resources in AWS Builder Center, enrolling in certification programs, participating in community discussions on AWS re:Post, exploring startup resources, or using Kiro to code your next app, your Apple Account now serves as a secure gateway to your AWS builder journey.

To learn more about AWS Builder ID and get started with Sign in with Apple, visit the AWS Builder ID documentation.


Amazon EC2 High Memory U7i-6TB instances now available in AWS GovCloud (US-West)

Amazon EC2 High Memory U7i instances with 6 TiB of memory (u7i-6tb.112xlarge) are now available in AWS GovCloud (US-West). U7i instances are part of the AWS 7th-generation instance family and are powered by custom 4th Generation Intel Xeon Scalable processors (Sapphire Rapids). U7i-6tb instances offer 6 TiB of DDR5 memory, enabling customers to scale transaction processing throughput in fast-growing data environments.

U7i-6tb instances offer 448 vCPUs, support up to 100 Gbps of Amazon Elastic Block Store (EBS) bandwidth for faster data loading and backups, deliver up to 100 Gbps of network bandwidth, and support ENA Express. U7i instances are ideal for customers running mission-critical in-memory databases like SAP HANA, Oracle, and SQL Server.

To learn more about U7i instances, visit the High Memory instances page.


Claude Opus 4.6 now available in Amazon Bedrock

Starting today, Amazon Bedrock supports Claude Opus 4.6. According to Anthropic, Opus 4.6 is their most intelligent model and the world's best model for coding, enterprise agents, and professional work. Claude Opus 4.6 brings advanced capabilities to Amazon Bedrock customers, including industry-leading performance for agentic tasks, complex coding projects, and enterprise-grade workflows that require deep reasoning and reliability.

Claude Opus 4.6 excels across use cases that require sophisticated reasoning and multi-step orchestration. For agentic workflows, it manages complex tasks across dozens of tools with industry-leading reliability, proactively spinning up subagents and working with less oversight. Developers can leverage Opus 4.6's coding capabilities for long-horizon projects, complex implementations, and large-scale codebases, handling the full lifecycle from requirements gathering to implementation and maintenance. Enterprise teams can use the model to power end-to-end workflows with professional polish, including financial analysis that surfaces insights requiring days of manual compilation, cybersecurity applications that catch subtle attack patterns, and computer use workflows that move data between applications. The model supports a 200K-token context window, with a 1M-token context window available in preview, enabling processing of extensive documents and codebases.

Claude Opus 4.6 is now available in Amazon Bedrock. For the full list of available regions, refer to the documentation. To learn more and get started with the model in Amazon Bedrock, read the About Amazon blog and visit the Amazon Bedrock console.
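As a minimal sketch, a single-turn request to the model through the Amazon Bedrock Converse API might look like the following. The model ID below is an assumption, not confirmed by this announcement; look up the exact identifier for your Region in the Bedrock console or with `aws bedrock list-foundation-models` before using it.

```python
# Sketch: single-turn request to Claude Opus 4.6 via the Bedrock Converse API.
# MODEL_ID is a hypothetical identifier -- verify it for your Region.

MODEL_ID = "anthropic.claude-opus-4-6-v1:0"  # assumed model identifier


def build_converse_request(prompt: str, max_tokens: int = 1024) -> dict:
    """Assemble the keyword arguments for bedrock_runtime.converse()."""
    return {
        "modelId": MODEL_ID,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": max_tokens},
    }


def ask(prompt: str) -> str:
    """Send the request and return the model's text reply."""
    import boto3  # imported lazily; this call requires AWS credentials

    client = boto3.client("bedrock-runtime")
    response = client.converse(**build_converse_request(prompt))
    return response["output"]["message"]["content"][0]["text"]
```

Separating request assembly from the network call keeps the payload shape easy to inspect and unit-test without AWS credentials.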


Amazon EC2 G6e instances now available in the Middle East (UAE) Region

Starting today, Amazon EC2 G6e instances powered by NVIDIA L40S Tensor Core GPUs are available in the Middle East (UAE) Region. G6e instances can be used for a wide range of machine learning and spatial computing use cases.

Customers can use G6e instances to deploy large language models (LLMs) and diffusion models for generating images, video, and audio. G6e instances also let customers create larger, more immersive 3D simulations and digital twins for spatial computing workloads. G6e instances feature up to 8 NVIDIA L40S Tensor Core GPUs with 48 GB of memory per GPU and third-generation AMD EPYC processors. They also support up to 192 vCPUs, up to 400 Gbps of network bandwidth, up to 1.536 TB of system memory, and up to 7.6 TB of local NVMe SSD storage.

Amazon EC2 G6e instances are available today in the AWS US East (N. Virginia, Ohio), US West (Oregon), Asia Pacific (Tokyo, Seoul), Middle East (UAE) and Europe (Frankfurt, Spain, Stockholm) Regions. Customers can purchase G6e instances as On-Demand Instances, Reserved Instances, Spot Instances, or as part of Savings Plans.

To get started, use the AWS Management Console, the AWS Command Line Interface (CLI), or the AWS SDKs. To learn more, visit the G6e instance page.
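A minimal sketch of launching a G6e instance with the AWS SDK for Python follows. The AMI ID is a placeholder; substitute a GPU-enabled AMI (for example, an AWS Deep Learning AMI) that exists in your target Region.

```python
# Sketch: launching one G6e instance with boto3. The AMI ID passed in is a
# placeholder; "me-central-1" is the Middle East (UAE) Region code.

def build_run_instances_request(ami_id: str, size: str = "g6e.xlarge") -> dict:
    """Assemble the keyword arguments for ec2.run_instances()."""
    if not size.startswith("g6e."):
        raise ValueError(f"expected a G6e instance size, got {size!r}")
    return {
        "ImageId": ami_id,
        "InstanceType": size,
        "MinCount": 1,
        "MaxCount": 1,
    }


def launch_g6e(ami_id: str, size: str = "g6e.xlarge"):
    import boto3  # imported lazily; this call requires AWS credentials

    ec2 = boto3.client("ec2", region_name="me-central-1")
    return ec2.run_instances(**build_run_instances_request(ami_id, size))
```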


Amazon WorkSpaces launches Graphics G6, Gr6, and G6f bundles

Today, Amazon WorkSpaces announces the availability of 12 new Graphics G6, Gr6, and G6f WorkSpaces bundles built on the Amazon EC2 G6 family. These bundles expand customers’ options for running graphics-intensive and GPU-accelerated workloads, and are available on both Amazon WorkSpaces Personal and Amazon WorkSpaces Core.

The new bundles are designed to support a wide range of performance, memory, and cost requirements: G6 bundles include five sizes with 1:4 vCPU-to-memory configurations, suitable for graphic design, CAD/CAM, and ML model training workloads. Gr6 bundles include two sizes with memory-optimized 1:8 vCPU-to-memory configurations, designed for higher-memory workloads such as 3D rendering, seismic visualization, and GIS processing. G6f bundles include five sizes and offer fractional GPU options (1/8, 1/4, and 1/2 GPU), enabling cost-effective access to GPU acceleration for workloads that do not require a full GPU. All Graphics G6, Gr6, and G6f WorkSpaces support Windows Server 2022 and allow customers to bring their own Windows desktop licenses for Windows 11.

These bundles are available in 13 AWS Regions: US East (N. Virginia), US West (Oregon), Canada (Central), Europe (Paris, Frankfurt, London), Asia Pacific (Tokyo, Mumbai, Sydney, Seoul), South America (São Paulo), and AWS GovCloud (US-West and US-East).

To get started, create a Graphics G6, Gr6, or G6f WorkSpace using the Amazon WorkSpaces console. For pay-as-you-go pricing details, see the Amazon WorkSpaces Pricing Page and the Amazon WorkSpaces Core Pricing Page.


Amazon EC2 I7ie instances now available in AWS Canada (Central)

AWS is announcing that Amazon EC2 I7ie instances are now available in the AWS Canada (Central) Region. Designed for large, storage-I/O-intensive workloads, I7ie instances are powered by 5th Gen Intel Xeon processors with an all-core turbo frequency of 3.2 GHz, offering up to 40% better compute performance and 20% better price performance than existing I3en instances. I7ie instances offer up to 120 TB of local NVMe storage density (the highest in the cloud for storage-optimized instances) and up to twice as many vCPUs and twice as much memory as prior-generation instances. Powered by 3rd-generation AWS Nitro SSDs, I7ie instances deliver up to 65% better real-time storage performance, up to 50% lower storage I/O latency, and 65% lower storage I/O latency variability compared to I3en instances.

I7ie instances are high-density, storage-optimized instances, ideal for workloads that require fast local storage with high random read/write performance and consistently low latency when accessing large data sets. These instances are available in nine virtual sizes and deliver up to 100 Gbps of network bandwidth and 60 Gbps of bandwidth to Amazon Elastic Block Store (EBS).

To learn more, visit the I7ie instances page.


Amazon EC2 Capacity Blocks for ML can now be shared across multiple accounts

Amazon Web Services (AWS) is announcing the general availability of cross-account sharing for Amazon EC2 Capacity Blocks for ML. This capability allows organizations to share reserved GPU capacity across AWS accounts using AWS Resource Access Manager (RAM), helping optimize utilization and reduce costs.

Organizations can now purchase Capacity Blocks and provision them across multiple accounts, allowing different workloads to access a pool of reserved capacity at no additional cost. This capability helps teams coordinate ML infrastructure investments and keeps reserved GPU capacity in continuous use across different workloads.

This feature is available for all Instance Capacity Blocks in AWS Regions where EC2 Capacity Blocks for ML is offered. For a complete list of supported Regions, refer to the Capacity Blocks supported Regions documentation.

To get started, create a resource share through AWS Resource Access Manager, add your Capacity Blocks for ML resources, and specify the target accounts you wish to share with. For more details, refer to the Capacity Block guide.
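The steps above can be sketched with the AWS SDK for Python as follows. The capacity reservation ARN and account IDs are placeholders, and the exact resource types eligible for sharing should be confirmed in the Capacity Block guide.

```python
# Sketch: sharing a Capacity Block's capacity reservation with other
# accounts via AWS RAM. ARN and account IDs below are placeholders.

def build_resource_share_request(
    share_name: str, capacity_block_arn: str, account_ids: list[str]
) -> dict:
    """Assemble the keyword arguments for ram.create_resource_share()."""
    return {
        "name": share_name,
        "resourceArns": [capacity_block_arn],
        "principals": account_ids,
        # Restrict sharing to accounts in your AWS Organization.
        "allowExternalPrincipals": False,
    }


def share_capacity_block(share_name, capacity_block_arn, account_ids):
    import boto3  # imported lazily; this call requires AWS credentials

    ram = boto3.client("ram")
    return ram.create_resource_share(
        **build_resource_share_request(share_name, capacity_block_arn, account_ids)
    )
```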


AWS Glue launches native REST API connector for universal data integration

AWS Glue now offers a native REST-based connector that enables customers to easily read data from any source with a REST-based API. Customers can now create custom connectors to any REST-enabled data source and seamlessly integrate that data into their AWS Glue ETL (Extract, Transform, and Load) jobs. This capability extends AWS Glue's existing connectivity to 100+ non-AWS data sources through 60+ native connectors and additional options on AWS Marketplace.

Previously, connecting to proprietary systems or emerging platforms required customers to build custom connectors by supplying specialized JARs with the necessary libraries. The new native REST API connector eliminates this complexity, making it easier to integrate data from any REST-enabled source. It reduces operational overhead by removing the need to install, update, or manage custom libraries, and it gives organizations the flexibility to adopt new data sources quickly as business needs evolve. It also streamlines ETL management by letting data engineers focus on data transformation and business logic rather than building and maintaining connector infrastructure.

The AWS Glue REST API connector is available in all AWS commercial regions where AWS Glue is available.

You can start using the AWS Glue REST API connector through the AWS Glue APIs, the AWS Command Line Interface (CLI), or the AWS SDKs. To get started, see the AWS Glue documentation.
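As a rough sketch, registering such a connection in the Glue Data Catalog with the AWS SDK for Python could look like the following. Note that the ConnectionType string and the property keys shown here are assumptions, not taken from this announcement; consult the AWS Glue documentation for the exact schema the native REST connector expects.

```python
# Sketch: registering a REST-based Glue connection via glue.create_connection.
# "REST" as a ConnectionType and the property keys are assumed, not confirmed.

def build_rest_connection_input(name: str, base_url: str) -> dict:
    """Assemble the ConnectionInput payload for glue.create_connection()."""
    return {
        "Name": name,
        "ConnectionType": "REST",  # assumed type string for the new connector
        "ConnectionProperties": {
            "BASE_URL": base_url,  # assumed property key
        },
    }


def create_rest_connection(name: str, base_url: str):
    import boto3  # imported lazily; this call requires AWS credentials

    glue = boto3.client("glue")
    return glue.create_connection(
        ConnectionInput=build_rest_connection_input(name, base_url)
    )
```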

