April 25, 2022

Top 20 AWS VPC Interview Questions and Answers

 

VPC (Virtual Private Cloud) is one of the AWS services gaining traction in the tech job market. Knowing the fundamentals of VPC can give job seekers who want to work with Amazon Web Services an advantage, and it is our job to prepare you. To that end, we've compiled a list of the AWS VPC interview questions that appear most frequently in AWS interviews. Before we get into them, let's go over some fundamentals of the technology that a newcomer taking AWS training should be aware of.

As most of you are aware, Amazon Web Services (AWS) is an Amazon subsidiary that offers on-demand cloud computing services, with users paying only for what they consume. Amazon offers a variety of services that let you integrate your local resources with the cloud with little effort. For example, AWS S3 (Simple Storage Service) offers object storage through several web service interfaces, such as REST, SOAP, and BitTorrent. Knowing how to respond to common AWS interview questions can give you an advantage over other candidates vying for a spot on an AWS team.


AWS (Amazon Web Services) Interview Questions and Answers

AWS Cloud Interview Questions and Answers


Ques. 1): Is there a limit to how many VPCs, VPNs, Subnets, and Gateways I can create?

Answer:

Yes, these resources are limited. You can create only five VPCs per region; to raise that limit, you must raise the internet gateway limit as well.

VPNs, elastic IP addresses, NAT gateways, and internet gateways all have a default limit of five per region. The maximum number of subnets per VPC is 200.

Furthermore, there is a maximum of 50 customer gateways per region.


AWS RedShift Interview Questions and Answers


Ques. 2): What Is It That Sets AWS VPC Apart From Other Private Clouds?

Answer:

The following two qualities distinguish AWS VPC from other cloud computing services:

When you need a private network in the cloud, it eliminates the need to set up and manage physical data centres, hardware, and/or virtual private networks.

AWS VPC is highly resistant to security and privacy threats thanks to its comprehensive security measures.


AWS Cloud Practitioner Essentials Questions and Answers


Ques. 3): What exactly is the meaning of the phrase "VPC"?

Answer:

VPC stands for Virtual Private Cloud: a private network space within the Amazon cloud where you can deploy AWS resources. It is the networking layer for Amazon EC2, which we have already talked about. Each virtual network you create in the cloud is logically isolated from your other virtual networks.

Although the layout of a VPC is similar to that of a typical network in a data centre, a VPC benefits from AWS's scalable infrastructure. Another significant benefit of VPC is that it is completely customizable: you can create subnets, set up route tables, configure network gateways, set up network access control lists, choose an IP address range, and more.
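The "choose an IP address range and create subnets" part can be sketched without touching AWS at all, using only Python's standard `ipaddress` module. The CIDR values below are illustrative, not AWS defaults:

```python
import ipaddress

# A VPC-style address range (RFC 1918 private space).
vpc_cidr = ipaddress.ip_network("10.0.0.0/16")

# Carve the VPC range into /24 subnets, as you would when
# planning public and private subnets for a VPC.
subnets = list(vpc_cidr.subnets(new_prefix=24))

print(len(subnets))   # 256 possible /24 subnets in a /16
print(subnets[0])     # 10.0.0.0/24
print(subnets[1])     # 10.0.1.0/24
```

The same arithmetic underlies the subnet sizes the VPC console lets you pick: a /16 VPC holds 2^(24-16) = 256 non-overlapping /24 subnets.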


AWS EC2 Interview Questions and Answers


Ques. 4): What is a Network Address Translation (NAT) Device?

Answer:

In your VPC, a NAT device allows instances in a private subnet to send outbound IPv4 traffic to other AWS services or the internet while preventing unsolicited inbound traffic from the internet. When traffic is sent to the internet, the source IP address is replaced with the address of the NAT device, and when the response returns, the device translates the address back to the instance's private IP address. AWS offers two types of NAT devices: the NAT instance and the NAT gateway. NAT instances are configured from Linux AMIs. NAT devices are not used for IPv6 traffic; an egress-only internet gateway serves that role instead.
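The translation described above can be modelled with a toy state table. This is an illustrative sketch of the concept only, not AWS's actual implementation; the addresses and ports are made up (the public address comes from the TEST-NET-3 documentation range):

```python
# Toy NAT model: outbound packets from private instances have their
# source rewritten to the NAT device's public address; a translation
# table lets responses be mapped back to the originating instance.

NAT_PUBLIC_IP = "203.0.113.10"   # example public address (TEST-NET-3)

translation_table = {}  # nat_port -> (private_ip, private_port)

def outbound(private_ip, private_port, nat_port):
    """Rewrite an outbound packet's source to the NAT address."""
    translation_table[nat_port] = (private_ip, private_port)
    return (NAT_PUBLIC_IP, nat_port)

def inbound(nat_port):
    """Map a response back to the originating private instance."""
    return translation_table.get(nat_port)  # None = unsolicited, dropped

src = outbound("10.0.1.25", 40001, 61000)
print(src)              # ('203.0.113.10', 61000)
print(inbound(61000))   # ('10.0.1.25', 40001)
print(inbound(59999))   # None: inbound traffic with no table entry is dropped
```

The `None` case is why a NAT device blocks inbound connections initiated from the internet: there is no translation entry to map them to.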


AWS Lambda Interview Questions and Answers


Ques. 5): What Are My VPC's Connectivity Options?

Answer:

You can link your VPC to the following resources:

  • The Internet (via an Internet gateway)
  • Your corporate data centre (via a hardware VPN connection terminating at the virtual private gateway)
  • The Internet as well as your company's data centre (utilizing both an Internet gateway and a virtual private gateway)
  • AWS's other services (via Internet gateway, NAT, virtual private gateway, or VPC endpoints)
  • Other Virtual Private Clouds (via VPC peering connections)


AWS Cloud Security Interview Questions and Answers


Ques. 6): Is it possible to use Amazon VPC with Amazon EC2 Reserved Instances?

Answer:

Yes. When you buy Reserved Instances, you can reserve an instance in Amazon VPC. AWS does not distinguish between instances running in Amazon VPC and standard Amazon EC2 when calculating your bill. AWS optimises which instances are charged at the reduced Reserved Instance rate, ensuring you always pay the least amount possible. However, your instance reservation will be specific to Amazon VPC; for more information, see the Reserved Instances page.


AWS Simple Storage Service (S3) Interview Questions and Answers


Ques. 7): Is it possible for Amazon EC2 instances within a VPC to communicate with Amazon EC2 instances outside of the VPC?

Answer:

Yes. If an Internet gateway is configured, Amazon VPC traffic destined for Amazon EC2 instances outside the VPC passes through the Internet gateway and then enters the public AWS network to reach the EC2 instance. If no Internet gateway has been configured, or if the instance is in a subnet configured to route through the virtual private gateway, the traffic traverses the VPN connection, egresses from your datacenter, and then re-enters the public AWS network.


AWS Fargate Interview Questions and Answers


Ques. 8): What is ELB (Elastic Load Balancing) and how does it affect Virtual Private Cloud?

Answer:

ELB is, as the name implies, a load balancer service for AWS deployments. A load balancer spreads incoming work across multiple computers so that it can be completed faster; in the same way, ELB distributes incoming application traffic across multiple targets, such as EC2 instances.

There are three types of load balancers, all of which provide scalability, availability, and security for fault-tolerant applications: the Classic, Network, and Application Load Balancers. Network and Application Load Balancers can be used in conjunction with a VPC and can route traffic to targets within VPCs.


AWS SageMaker Interview Questions and Answers


Ques. 9): What Are The Amazon VPC Components?

Answer:

Amazon VPC comprises a variety of objects that will be familiar to customers with existing networks:

  • A Virtual Private Cloud (VPC): A logically isolated virtual network in the AWS cloud. You define a VPC’s IP address space from a range you select.
  • Subnet: A segment of a VPC’s IP address range where you can place groups of isolated resources.
  • Internet Gateway: The Amazon VPC side of a connection to the public Internet.
  • NAT Gateway: A highly available, managed Network Address Translation (NAT) service for your resources in a private subnet to access the Internet.
  • Hardware VPN Connection: A hardware-based VPN connection between your Amazon VPC and your datacenter, home network, or co-location facility.
  • Virtual Private Gateway: The Amazon VPC side of a VPN connection.
  • Customer Gateway: Your side of a VPN connection.
  • Router: Routers interconnect subnets and direct traffic between Internet gateways, virtual private gateways, NAT gateways, and subnets.
  • Peering Connection: A peering connection enables you to route traffic via private IP addresses between two peered VPCs.
  • VPC Endpoint for S3: Enables Amazon S3 access from within your VPC without using an Internet gateway or NAT, and allows you to control the access using VPC endpoint policies.
  • Egress-only Internet Gateway: A stateful gateway that provides egress-only access for IPv6 traffic from the VPC to the Internet.


AWS DynamoDB Interview Questions and Answers


Ques. 10): In a VPC, what IP address range can be used?

Answer:

For the primary CIDR block, you can use any IPv4 address range, including RFC 1918 ranges or publicly routable IP ranges. Certain restrictions apply to secondary CIDR blocks. Publicly routable IP blocks can only be reached via the virtual private gateway, not via the Internet gateway, and AWS does not advertise customer-owned IP address blocks on the Internet. To assign an Amazon-provided IPv6 CIDR block to a VPC, call the relevant API or use the AWS Management Console.
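Whether a candidate CIDR falls inside the RFC 1918 private space is easy to check programmatically. A minimal sketch with the standard `ipaddress` module (the example CIDRs are illustrative; 198.51.100.0/24 is a documentation range standing in for a publicly routable block):

```python
import ipaddress

# RFC 1918 defines three private IPv4 ranges.
RFC1918 = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("172.16.0.0/12"),
    ipaddress.ip_network("192.168.0.0/16"),
]

def is_rfc1918(cidr: str) -> bool:
    """True if the CIDR lies entirely inside an RFC 1918 block."""
    net = ipaddress.ip_network(cidr)
    return any(net.subnet_of(block) for block in RFC1918)

print(is_rfc1918("10.0.0.0/16"))      # True
print(is_rfc1918("192.168.1.0/24"))   # True
print(is_rfc1918("198.51.100.0/24"))  # False: publicly routable range
```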


AWS Cloudwatch interview Questions and Answers


Ques. 11): What Is The Difference Between A VPC's Security Groups And Network ACLs?

Answer:

A VPC's security groups define which communication is permitted to and from an Amazon EC2 instance. Network ACLs assess traffic entering and exiting a network at the subnet level. Allow and Deny rules can be set using network ACLs. Traffic between instances in the same subnet is not filtered by network ACLs. Furthermore, network ACLs filter in a stateless manner, whereas security groups filter in a stateful manner.


AWS Elastic Block Store (EBS) Interview Questions and Answers


Ques. 12): Can My Existing EC2 Account Use A Default VPC?

Answer:

Yes, but we can only enable an existing account for a default VPC if that account has no EC2-Classic resources in that region. All non-VPC deployed Elastic Load Balancers, Amazon RDS, Amazon ElastiCache, and Amazon Redshift resources in that region must also be terminated. All future resource launches, including instances created via Auto Scaling, will be placed in your default VPC after your account has been configured for a default VPC. Contact AWS Support to get your existing account set up with a default VPC. To see if you're eligible for a default VPC, we'll look at your request as well as your existing AWS services and EC2-Classic presence.


AWS Amplify Interview Questions and Answers


Ques. 13): What Is The Best Way To Tell If My Account Is Set To Use A Default VPC?

Answer:

The Amazon EC2 console shows you which platforms you can use to launch instances in the selected region, as well as whether you have a default VPC there. In the navigation bar, make sure the region you'll be using is selected. Look under "Account Attributes" on the Amazon EC2 console dashboard for "Supported Platforms." If both EC2-Classic and EC2-VPC are present, you can start instances on either platform. You can only launch instances into EC2-VPC if there is only one value, EC2-VPC. If your account is configured to use a default VPC, your default VPC ID will be presented under "Account Attributes". You can also use the EC2 DescribeAccountAttributes API or CLI to describe your supported platforms.


AWS Secrets Manager Interview Questions and Answers


Ques. 14): How to build a custom VPC?

Answer:

To build a custom VPC, follow these steps:

  • Create a Virtual Private Cloud
  • Then create Subnets
  • Further create an Internet Gateway
  • Attach this new Gateway to your VPC
  • Create a new Route Table
  • Add the gateway as a route to the new route table
  • Add a subnet to the route table’s subnet association
  • Create a web server for public subnet and a database server for the private subnet
  • Create a new security group for the NAT
  • Add HTTP and HTTPS inbound rules that let in traffic from the private subnet's IP range
  • Create a NAT for public subnet
  • Create an elastic IP
  • Associate this IP to the NAT
  • Disable source/destination checks for the NAT instance
  • Add the NAT to the initial VPC route table as a route


Top 20 AWS Django Interview Questions and Answers


Ques. 15): When it comes to filtering, what's the difference between stateful and stateless?

Answer:

Stateful filtering keeps track of the origin of a request and can automatically allow the response back to the originating machine. For example, a stateful filter that permits inbound traffic to TCP port 80 on a web server will also allow the return traffic, which usually has a high-numbered destination port (e.g., destination TCP port 63912), to pass back to the client. The filtering device keeps a state table that tracks origin and destination port numbers and IP addresses, so only one rule is required on the filtering device: allow inbound traffic to the web server on TCP port 80.

Stateless filtering, on the other hand, looks only at the source or destination IP address and the destination port, regardless of whether the traffic is a new request or a response to a request. In the example above, the filtering device would need two rules: one to allow inbound traffic to the web server on TCP port 80, and another to allow outbound traffic from the web server (TCP port range 49152 through 65535).
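The two-rules-versus-one difference can be made concrete with a toy simulation. The rule format and class below are invented for illustration; they model the behaviour of network ACLs (stateless) and security groups (stateful), not any real AWS API:

```python
# Stateless filter: every direction needs an explicit rule,
# including the high-port return traffic.
stateless_rules = [
    {"direction": "in",  "port": 80},                    # requests to the web server
    {"direction": "out", "ports": range(49152, 65536)},  # explicit rule for replies
]

def stateless_allows(direction, port):
    for rule in stateless_rules:
        if rule["direction"] != direction:
            continue
        if port == rule.get("port") or port in rule.get("ports", ()):
            return True
    return False

class StatefulFilter:
    """One rule (allow inbound TCP 80); replies ride the state table."""
    def __init__(self):
        self.state = set()

    def allows_in(self, client, client_port):
        self.state.add((client, client_port))  # remember the connection
        return True

    def allows_out(self, client, client_port):
        # Allowed only if it is a reply to a tracked connection.
        return (client, client_port) in self.state

print(stateless_allows("in", 80))      # True
print(stateless_allows("out", 63912))  # True, but only via the second rule

sf = StatefulFilter()
sf.allows_in("198.51.100.7", 63912)
print(sf.allows_out("198.51.100.7", 63912))  # True: no extra rule needed
```

This mirrors why a VPC's network ACLs need paired inbound and outbound (ephemeral-port) rules while a security group needs only the inbound one.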


AWS Cloud Support Engineer Interview Question and Answers


Ques. 16): What is ClassicLink, exactly?

Answer:

Amazon VPC ClassicLink allows EC2 instances on the EC2-Classic platform to communicate with VPC instances via private IP addresses. To use ClassicLink, enable it for a VPC in your account and then associate a security group from that VPC with an EC2-Classic instance. All of that VPC security group's rules will then apply to communications between EC2-Classic instances and instances in the VPC.


AWS Solution Architect Interview Questions and Answers


Ques. 17): What is the best way to link a VPC to my corporate datacenter?

Answer:

By establishing a hardware VPN connection between your existing network and Amazon VPC, you can communicate with Amazon EC2 instances within a VPC as if they were on your local network. On Amazon EC2 instances in a VPC accessible via a hardware VPN connection, AWS does not execute network address translation (NAT).


AWS Glue Interview Questions and Answers


Ques. 18): How do I specify the Availability Zone in which my Amazon EC2 instances will be launched?

Answer:

When you launch an Amazon EC2 instance, you specify the subnet in which it will run. The instance is then deployed in the Availability Zone that corresponds to the specified subnet.


AWS Aurora Interview Questions and Answers


Ques. 19): Why can't I ping the router that joins my subnets, or my default gateway?

Answer:

Ping (ICMP Echo Request and Echo Reply) requests to your VPC's router are not supported. Pinging between Amazon EC2 instances within a VPC is possible if your operating system's firewalls, VPC security groups, and network ACLs allow it.


AWS DevOps Cloud Interview Questions and Answers


Ques. 20): Is It Possible To Control And Manage Amazon VPC Using The AWS Management Console?

Answer:

Yes. VPCs, subnets, route tables, Internet gateways, and IPsec VPN connections can all be managed through the AWS Management Console. You can also create a VPC with the help of a simple wizard.

AWS RDS Interview Questions and Answers

 


April 21, 2022

Top 20 AWS Glue Interview Questions and Answers

  

              

AWS Glue is a serverless data integration tool that makes finding, preparing, and combining data for analytics, machine learning, and application development a breeze. AWS Glue has all of the data integration features you'll need, so you can start analysing and using your data in minutes rather than months. To make data integration easier, AWS Glue offers both visual and code-based interfaces. The AWS Glue Data Catalog allows users to quickly locate and retrieve data. With a few clicks in AWS Glue Studio, data engineers and ETL (extract, transform, and load) developers can graphically construct, run, and monitor ETL workflows. Data analysts and scientists can use AWS Glue DataBrew to visually enrich, clean, and standardise data without having to write code. Application developers can use Structured Query Language (SQL) to combine and replicate data across disparate data stores with AWS Glue Elastic Views.

 

AWS Cloud Practitioner Essentials Questions and Answers

Amazon Web Services Interview Questions and Answers

AWS Cloud Interview Questions and Answers

 

Ques. 1): What are your thoughts on AWS Glue?

Answer:

AWS Glue is a service that makes it simple and cost-effective to categorise, clean, and reliably move data across various data stores and data streams.

  • It comprises the AWS Glue Data Catalog, a central metadata repository.
  • By handling dependency resolution, job monitoring, and retries, AWS Glue assists in the generation of Python or Scala code.
  • AWS Glue is a serverless infrastructure that is easy to set up and manage, and it has a DynamicFrame component that we can use in our ETL scripts.
  • A DynamicFrame is similar to an Apache Spark DataFrame: a data abstraction for organising data into rows and columns.

 

AWS EC2 Interview Questions and Answers

Amazon EMR Interview Questions and Answers

Amazon OpenSearch Interview Questions and Answers

 

Ques. 2): Which Data Stores Can I Crawl using Glue?

Answer:

Crawlers can crawl both file-based and table-based data stores. Through their respective native interfaces, crawlers can crawl Amazon S3 and Amazon DynamoDB. Through a JDBC connection, crawlers can crawl the following data stores:
  • Amazon Redshift
  • Amazon Relational Database Service (Amazon RDS)
  • Amazon Aurora
  • Microsoft SQL Server
  • MySQL
  • Oracle
  • PostgreSQL
  • Publicly accessible databases

 

AWS RedShift Interview Questions and Answers

AWS FinSpace Interview Questions and Answers

AWS MSK Interview Questions and Answers

 

Ques. 3): What components does AWS Glue make use of?

Answer:

AWS Glue is made up of the following components:

  • The Data Catalog, a cloud-hosted metadata repository.
  • The ETL Engine, which assists with the generation of Python and Scala code.
  • The Flexible Scheduler, which handles dependency resolution, job monitoring, and retrying.
  • AWS Glue DataBrew, which provides a visual interface for normalizing and cleaning data.
  • AWS Glue Elastic Views, which replicates and combines data across various data stores.

 

AWS Lambda Interview Questions and Answers

AWS Simple Notification Service (SNS) Interview Questions and Answers

AWS QuickSight Interview Questions and Answers


Ques. 4): What is AWS Glue DataBrew, and how does it work?

Answer:

AWS Glue DataBrew is a visual data preparation tool that allows data analysts and scientists to prepare data without writing code, using an interactive, point-and-click visual interface. With Glue DataBrew you can easily visualise, clean, and normalise terabytes, even petabytes, of data directly from your data lake, data warehouses, and databases, including Amazon S3, Amazon Redshift, Amazon Aurora, and Amazon RDS. AWS Glue DataBrew is generally available in the US East (N. Virginia), US East (Ohio), US West (Oregon), EU (Ireland), EU (Frankfurt), Asia Pacific (Sydney), and Asia Pacific (Tokyo) Regions.

 

AWS Cloud Security Interview Questions and Answers

AWS SQS Interview Questions and Answers

AWS AppFlow Interview Questions and Answers

 

Ques. 5): What steps do I need to take to get my metadata into the AWS Glue Data Catalog?

Answer:

There are several ways to populate metadata into the AWS Glue Data Catalog with AWS Glue. Glue crawlers automatically deduce schemas and partition structure from various data sources you control, populating the Glue Data Catalog with corresponding table definitions and statistics. You can also schedule crawlers to run on a regular basis to keep your metadata current and in sync with the underlying data. Alternatively, you can use the AWS Glue Console or the API to manually add and change table details. On an Amazon EMR cluster, you can also run Hive DDL statements via the Amazon Athena Console or a Hive client. Finally, if you already have a persistent Apache Hive Metastore, you can perform a bulk import of that metadata into the AWS Glue Data Catalog by using our import script.

 

AWS Simple Storage Service (S3) Interview Questions and Answers

AWS QLDB Interview Questions and Answers

AWS STEP Functions Interview Questions and Answers

 

Ques. 6): What steps does AWS Glue take to deduplicate my data?

Answer:

The FindMatches ML Transform in AWS Glue makes it simple to locate and link records that refer to the same entity but lack a unique identifier. Before FindMatches, data-matching problems were usually solved deterministically by constructing a large number of hand-tuned rules. Behind the scenes, FindMatches employs machine learning algorithms to learn how to match records based on each developer's specific business criteria. FindMatches first selects records for the client to categorise as matching or not matching, and then creates an ML Transform using machine learning.

 

AWS Fargate Interview Questions and Answers

Amazon Managed Blockchain Questions and Answers

AWS Message Queue(MQ) Interview Questions and Answers

 

Ques. 7): To use AWS Glue DataBrew, do I need to use AWS Glue Data Catalog or AWS Lake Formation?

Answer:

No. You don't need the AWS Glue Data Catalog or AWS Lake Formation to use AWS Glue DataBrew. If you utilise either the AWS Glue Data Catalog or AWS Lake Formation, DataBrew users can choose from a centralised data catalogue of data sets available to them.

 

AWS SageMaker Interview Questions and Answers

AWS Serverless Application Model(SAM) Interview Questions and Answers

AWS X-Ray Interview Questions and Answers

 

Ques. 8): What are the benefits of using AWS Glue Schema Registry?

Answer:

  • Validate schemas using the AWS Glue Schema Registry. Schemas used for data production are checked against schemas in a central registry when data streaming apps are linked with AWS Glue Schema Registry, allowing you to centrally regulate data quality.
  • Maintain schema evolution. One of eight compatibility modes can be used to specify criteria for how schemas can and cannot evolve.
  • Improve the quality of your data. Serializers compare data producers' schemas to those in the registry, enhancing data quality at the source and avoiding downstream difficulties caused by unexpected schema drift.
  • Save money. Serializers transform data into a binary format, which can then be compressed before being provided, lowering data transit and storage costs.
  • Increase processing speed. A data stream often contains records with multiple schemas. The Schema Registry allows applications that read data streams to process each record based on its schema rather than parsing its contents, improving processing efficiency.

 

AWS DynamoDB Interview Questions and Answers

AWS Wavelength Interview Questions and Answers

AWS Outposts Interview Questions and Answers


Ques. 9): What are the benefits of using AWS Glue Elastic Views?

Answer:

Use AWS Glue Elastic Views to combine and continuously replicate data across various data stores in near-real time. This is often the case when developing new application functionality that requires access to data from one or more existing data stores. For example, an organisation might use a customer relationship management (CRM) application to keep track of client relationships and an e-commerce website to conduct online transactions, with each application storing its data in one or more data stores. Now the organisation is developing a new custom application that generates and presents special offers to active website visitors by combining customer data from the CRM application with clickstream data from the e-commerce application. With AWS Glue Elastic Views, a developer can build this functionality in three steps. First, they use AWS Glue Elastic Views to connect to the CRM and e-commerce application data stores. Then, using SQL, they select the appropriate data from the CRM and e-commerce data stores. Finally, they connect the custom application's data store to the results.

 

AWS Cloudwatch interview Questions and Answers

AWS Lightsail Questions and Answers

AWS Keyspaces Interview Questions and Answers 


Ques. 10): Which AWS services and open source projects make use of AWS Glue Data Catalog?

Answer:

The AWS services and open source projects that make use of the AWS Glue Data Catalog include Amazon Athena, Amazon Redshift Spectrum, Amazon EMR, AWS Glue ETL jobs, and AWS Lake Formation; it can also serve as a metastore for Apache Hive and Apache Spark workloads.

 

AWS Elastic Block Store (EBS) Interview Questions and Answers

AWS ElastiCache Interview Questions and Answers

AWS ECR Interview Questions and Answers


Ques. 11): When should I employ a Glue Classifier?

Answer:

When you crawl a data store to define metadata tables in the AWS Glue Data Catalog, you employ classifiers. You can use an ordered set of classifiers to set up your crawler. When a crawler calls a classifier, the classifier determines whether or not the data has been identified. If the first classifier fails to recognise the data or is unsure, the crawler moves on to the next classifier in the list to see if it can recognise the data.
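The ordered-classifier behaviour described above can be sketched as a simple first-match chain. The classifier names and predicates below are made up for illustration; they are not Glue's real built-in classifiers:

```python
# Each classifier either recognises the data (returns a verdict)
# or defers to the next classifier in the ordered list (returns None).

def json_classifier(sample: str):
    return "json" if sample.lstrip().startswith("{") else None

def csv_classifier(sample: str):
    return "csv" if "," in sample else None

def classify(sample, classifiers):
    """Return the verdict of the first classifier that recognises the data."""
    for classifier in classifiers:
        verdict = classifier(sample)
        if verdict is not None:
            return verdict
    return "UNKNOWN"

chain = [json_classifier, csv_classifier]
print(classify('{"id": 1}', chain))     # json
print(classify("id,name\n1,a", chain))  # csv
print(classify("plain text", chain))    # UNKNOWN
```

Order matters: putting a permissive classifier first would shadow the ones behind it, which is why Glue lets you arrange custom classifiers in a specific order.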

 

AWS Amplify Interview Questions and Answers

AWS DocumentDB Interview Questions and Answers

AWS EC2 Auto Scaling Interview Questions and Answers


Ques. 12): How can I connect to the AWS Glue Schema Registry in a secure manner?

Answer:

By configuring an interface VPC endpoint for AWS Glue, you may use AWS PrivateLink to link your data producer's VPC to AWS Glue. Communication between your VPC and AWS Glue occurs entirely within the AWS network when you use a VPC interface endpoint.

 

AWS Cloud Interview Questions and Answers

AWS Compute Optimizer Interview Questions and Answers

AWS CodeStar Interview Questions and Answers

 

Ques. 13): When a crawler runs, what happens?

Answer:

To examine a data store, a crawler performs the following actions:

  • Classifies the data, using built-in or custom classifiers, to determine the raw data's format, schema, and associated properties.
  • Groups the data into tables or partitions based on crawler heuristics.
  • Writes metadata to the Data Catalog; you can configure how the crawler adds, changes, and deletes tables and partitions.

 

AWS Secrets Manager Interview Questions and Answers

AWS CloudShell Interview Questions and Answers

AWS Batch Interview Questions and Answers

 

Ques. 14): When should I utilise Amazon EMR vs. AWS Glue?

Answer:

AWS Glue is a scale-out execution environment for your data transformation activities that runs on top of the Apache Spark ecosystem. AWS Glue infers, adapts, and monitors your ETL jobs, making job creation and maintenance much easier. Amazon EMR, in contrast, gives you direct, lower-level access to your Hadoop environment, allowing you to use tools beyond Apache Spark.

 

AWS Django Interview Questions and Answers

AWS App2Container Questions and Answers

AWS App Runner Questions and Answers

 

Ques. 15): What options do I have for customising the ETL code provided by AWS Glue?

Answer:

Scala or Python code is generated by AWS Glue's ETL script suggestion algorithm. It makes use of Glue's custom ETL framework to make it easier to access data sources and manage job execution. You can use AWS Glue's own library to write ETL code, or you can use inline editing in the AWS Glue Console script editor to write arbitrary code in Scala or Python, then download the auto-generated code and edit it in your own IDE.   

 

AWS Cloud Support Engineer Interview Question and Answers

AWS Timestream Interview Questions and Answers

AWS PinPoint Questions and Answers

 

Ques. 16): How am I charged for AWS Glue?

Answer:

Beyond the AWS Glue Data Catalog free tier, you pay a simple monthly fee to store and access metadata in the AWS Glue Data Catalog. Crawler runs are billed at an hourly rate, charged per second, with a 10-minute minimum. If you use a development endpoint to author your ETL code interactively, you pay an hourly rate, billed per second, for the time your development endpoint is provisioned, with a 10-minute minimum. In addition, depending on the Glue version you choose, you pay an hourly rate, billed per second, for the ETL job itself, with a 1-minute or 10-minute minimum.
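The per-second-with-a-minimum billing described above reduces to one line of arithmetic. The hourly rates below are placeholders for illustration, not actual AWS prices:

```python
def glue_charge(seconds_used, hourly_rate, minimum_seconds):
    """Bill per second, but never for less than the minimum duration."""
    billable = max(seconds_used, minimum_seconds)
    return round(billable * hourly_rate / 3600, 6)

# A 3-minute crawler run still bills the 10-minute minimum:
print(glue_charge(180, hourly_rate=0.44, minimum_seconds=600))   # 0.073333
# A 30-minute ETL job bills its actual duration:
print(glue_charge(1800, hourly_rate=0.44, minimum_seconds=60))   # 0.22
```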

 

AWS Solution Architect Interview Questions and Answers

AWS Neptune Interview Questions and Answers

AWS MemoryDB Questions and Answers

 

Ques. 17): Is it possible to monitor and troubleshoot AWS Glue ETL operations using the Apache Spark web UI?

Answer:

Yes, you can monitor and debug AWS Glue ETL processes running on the AWS Glue job system, as well as Apache Spark applications running on AWS Glue development endpoints, using the Apache Spark web UI. For each job, the Spark UI allows you to verify the following:

  • The event timeline of each Spark stage
  • The job's directed acyclic graph (DAG)
  • Physical and logical plans for SparkSQL queries
  • The underlying Spark environment variables for each job

 

AWS Aurora Interview Questions and Answers

AWS EventBridge Interview Questions and Answers

 

Ques. 18): What happens if AWS Glue encounters an ETL error?

Answer:

AWS Glue keeps track of job event metrics and faults and sends all alerts to Amazon CloudWatch. With Amazon CloudWatch, you can set up a variety of actions to be triggered in response to certain AWS Glue notifications. You may use an AWS Lambda function to handle an error or success notification from Glue, for example. The default retry behaviour in Glue is to retry all failures three times before sending an error notification.

 

AWS DevOps Cloud Interview Questions and Answers

 

Ques. 19): When a glue crawler decides to construct partitions, how does it do so?

Answer:

When an AWS Glue crawler scans an Amazon S3 path and finds multiple folders in a bucket, it determines the root of a table in the folder structure and which folders are partitions of that table. The table's name is derived from the Amazon S3 prefix, or folder name. You specify an include path that points to the folder level to crawl. When the majority of schemas at a folder level are similar, the crawler creates partitions of a single table instead of separate tables.
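The folder-to-partition idea can be sketched with plain string handling: derive the table name from the top-level prefix and treat `key=value` folders as partition columns. This only mimics the idea on Hive-style paths; the real crawler's heuristics are richer, and the example key is made up:

```python
def infer_table_and_partitions(key: str):
    """Infer (table, partitions) from a Hive-style S3 object key."""
    parts = key.split("/")
    table = parts[0]  # top-level prefix becomes the table name
    # Folders shaped like key=value become partition columns.
    partitions = dict(
        p.split("=", 1) for p in parts[1:-1] if "=" in p
    )
    return table, partitions

table, parts = infer_table_and_partitions("sales/year=2022/month=04/data.csv")
print(table)   # sales
print(parts)   # {'year': '2022', 'month': '04'}
```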

 

AWS AppSync Interview Questions and Answers

 

Ques. 20): AWS Glue Elastic Views currently supports which sources and targets?

Answer:

Amazon DynamoDB is currently supported as a source for the preview, with Amazon Aurora MySQL, Amazon Aurora PostgreSQL, Amazon RDS for MySQL, and Amazon RDS for PostgreSQL to follow. Amazon Redshift, Amazon S3, and Amazon OpenSearch Service are currently supported targets, with support for Amazon Aurora MySQL, Amazon Aurora PostgreSQL, Amazon RDS for MySQL, and Amazon RDS for PostgreSQL on the way.

 

AWS Database Interview Questions and Answers

 

 


More AWS interview Questions and Answers:



AWS Glue Interview Questions and Answers


Amazon Athena Interview Questions and Answers


AWS VPC Interview Questions and Answers


AWS DevOps Cloud Interview Questions and Answers


AWS Cloud9 Interview Questions and Answers


AWS Database Interview Questions and Answers


AWS ActiveMQ Interview Questions and Answers


AWS CloudFormation Interview Questions and Answers


AWS GuardDuty Questions and Answers


AWS Control Tower Interview Questions and Answers


AWS Lake Formation Interview Questions and Answers


AWS Data Pipeline Interview Questions and Answers


Amazon CloudSearch Interview Questions and Answers 


AWS Transit Gateway Interview Questions and Answers


Amazon Detective Interview Questions and Answers