A set of rules that controls access to resources based on permissions.
A software development methodology focused on iterative development, where requirements and solutions evolve through collaboration.
A formal proclamation of four key values and 12 principles to guide an iterative and people-focused approach to software development.
A group of software development methods based on iterative development, where solutions evolve through collaboration.
The simulation of human intelligence processes by machines, especially computer systems.
The identification of unusual patterns that do not conform to expected behavior, often used in fraud detection and network security.
A distributed event streaming platform used for building real-time data pipelines and streaming applications.
A set of rules that allows different software entities to communicate with each other.
A tool that sits in front of an API and acts as a reverse proxy to accept all API calls, aggregate the various services required to fulfill them, and return the appropriate result.
The process of controlling the rate of traffic sent or received by an API to prevent abuse or overuse.
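One common way to implement this is a token bucket. The sketch below is purely illustrative (the class name, rate, and capacity are hypothetical), assuming requests arrive back-to-back:

```python
import time

class TokenBucket:
    """Token-bucket rate limiter: allows `rate` requests per second,
    with bursts of up to `capacity` requests."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=5, capacity=2)
# Three back-to-back calls: the first two fit the burst capacity,
# the third is rejected because the bucket has not refilled yet.
results = [bucket.allow() for _ in range(3)]
```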
The process of managing the life of an application from conception through development, testing, deployment, support, and retirement.
A repository where binary files, libraries, or artifacts created during the software development process are stored and managed.
A comprehensive and widely adopted cloud platform, offering various services for computing, storage, and networking.
The process of verifying the identity of a user, device, or entity in a computer system.
The process of determining if a user, device, or entity has permission to access a resource or perform an action.
The process of automatically adjusting the number of computing resources allocated to handle the load of an application.
The use of software tools to execute tests automatically, manage test data, and utilize the results to improve software quality.
Server-side development that involves databases, servers, and application logic.
The maximum rate of data transfer across a given path. In computing, it often refers to the volume of information per unit of time that a transmission medium can handle.
An agile software development technique that encourages collaboration among developers, QA, and non-technical or business participants in a software project.
Large and complex data sets that require advanced analysis techniques.
The most basic form of computer code or programming data, using a binary system of 0s and 1s.
A system of recording information in a way that makes it difficult or impossible to change, hack, or cheat the system. It is a digital ledger of transactions that is duplicated and distributed across the entire network of computer systems.
A set of rules that defines how data is structured, shared, and validated across a blockchain network.
A popular front-end open-source toolkit for developing responsive, mobile-first projects on the web using HTML, CSS, and JavaScript.
A point of congestion in a system that prevents it from performing optimally.
A type of network configuration used in Docker that allows multiple containers on the same host to communicate.
A deal offered by many websites, organizations, and software developers by which individuals can receive recognition and compensation for reporting bugs, especially security exploits and vulnerabilities.
Technologies and strategies used by enterprises for data analysis and business information.
The use of technology to execute recurring tasks or processes in a business where manual effort can be replaced.
The process of storing data in a temporary storage area to improve performance and reduce latency.
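A minimal in-process illustration of caching, using Python's standard `functools.lru_cache` (the function and counter here are hypothetical stand-ins for an expensive computation):

```python
import functools

call_count = 0  # tracks how many times the expensive work actually runs

@functools.lru_cache(maxsize=128)
def slow_square(n: int) -> int:
    """Pretend this is an expensive computation; results are cached in memory."""
    global call_count
    call_count += 1
    return n * n

first = slow_square(12)   # computed and stored in the cache
second = slow_square(12)  # served from the cache; no recomputation
```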
In distributed data stores, it is impossible to simultaneously provide more than two out of three guarantees: Consistency, Availability, and Partition Tolerance.
A style sheet language used for describing the presentation of a document written in HTML or XML.
Continuous Integration/Continuous Deployment Pipelines automate software delivery processes, including building, testing, and deploying code.
The discipline of experimenting on a software system to build confidence in the system’s capability to withstand turbulent conditions in production.
A tool developed by Netflix to randomly terminate instances in their production environment to ensure that engineers implement their services to be resilient to instance failures.
The use of a group of linked computers to work together so that they can be viewed as a single system.
The delivery of computing services over the internet, including storage, processing, and networking.
Describes applications designed to run in a cloud environment, taking full advantage of cloud computing models, services, and infrastructure.
Applications designed to fully leverage cloud environments, often using microservices, containers, and continuous delivery.
A framework that decouples hardware resources from the physical configuration, allowing them to be treated as services.
A database used by an organization to store information about hardware and software assets, often used in IT service management.
A DevOps practice where code changes are automatically tested and deployed.
The automated arrangement, coordination, and management of software containers.
The process of packaging software with its dependencies to run consistently across different computing environments.
A mechanism that allows restricted resources on a web page to be requested from another domain outside the domain from which the resource originated.
Software that helps manage a company’s interactions with current and potential customers.
Practices and technologies designed to protect computers, networks, and data from unauthorized access or attacks.
The process of examining data sets to draw conclusions from the information they contain.
The process of converting data into a code to prevent unauthorized access.
The management of data availability, usability, integrity, and security in enterprise systems.
A centralized repository designed to store, process, and secure large amounts of structured, semi-structured, and unstructured data.
A strategy for making sure that end users do not send sensitive or critical information outside the corporate network.
The process of transferring data between different storage types, formats, or computer systems.
A set of processes that systematically move data from one system to another.
A centralized repository for storing large amounts of structured and unstructured data from various sources.
A method of splitting a database into smaller, more manageable pieces called shards that can be spread across multiple servers.
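A hash-based routing function is one simple way to pick a shard for a given key. This sketch assumes a hypothetical four-shard setup and uses a stable hash so the same key always routes to the same shard:

```python
import hashlib

NUM_SHARDS = 4  # hypothetical shard count

def shard_for(key: str) -> int:
    """Map a record key to a shard using a stable hash of the key."""
    digest = hashlib.md5(key.encode()).hexdigest()
    return int(digest, 16) % NUM_SHARDS

# The same key always routes to the same shard, so reads find
# the data that writes placed there.
s1 = shard_for("user:42")
s2 = shard_for("user:42")
```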
A subset of machine learning that uses neural networks with many layers, often applied in image and speech recognition.
A set of practices that combines software development and IT operations to shorten the development lifecycle.
The integration of digital technology into all business areas, fundamentally changing how a business operates and delivers value to customers.
A documented, structured approach with instructions for responding to unplanned incidents that threaten an IT infrastructure, including hardware, software, networks, processes, and people.
A computing system in which multiple components located on different networked computers communicate and coordinate their actions by passing messages.
A type of cyber attack where multiple systems overwhelm the bandwidth or resources of a targeted system, usually one or more web servers.
A digital system for recording the transaction of assets in which the transactions and their details are recorded in multiple places at the same time.
A platform for developing, shipping, and running applications inside containers.
A software development approach that emphasizes collaboration between technical experts and domain experts.
A security testing method that involves testing an application or software product in its running state.
A distributed computing paradigm that brings computation and data storage closer to the location where it is needed to improve response times and save bandwidth.
A device or component located at the edge of a network that serves as an entry or exit point for data entering or leaving the network.
An AWS service that allows users to deploy and manage applications in the AWS Cloud without worrying about the infrastructure that runs those applications.
A service that automatically distributes incoming application traffic across multiple targets, such as EC2 instances.
A set of open-source tools from Elastic (Elasticsearch, Logstash, Kibana, and Beats) used for searching, analyzing, and visualizing data in real time.
An open-source, distributed search and analytics engine for all types of data, including textual, numerical, geospatial, structured, and unstructured.
The use of software and computer systems architectural principles to integrate a set of enterprise computer applications.
A software architecture model used for designing and implementing communication between mutually interacting software applications in a service-oriented architecture.
Software that manages business processes by integrating all facets of an operation, including planning, development, sales, and marketing.
A data warehousing process that extracts data from multiple sources, transforms it into a suitable format, and loads it into a data warehouse.
A pattern in which changes to the application state are stored as a sequence of events.
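The current state is derived by replaying the stored events rather than reading a mutable record. A minimal sketch (the event shapes and amounts are hypothetical):

```python
# An append-only log of state changes for a hypothetical account.
events = [
    {"type": "deposited", "amount": 100},
    {"type": "withdrawn", "amount": 30},
    {"type": "deposited", "amount": 5},
]

def apply(balance: int, event: dict) -> int:
    """Fold a single event into the current state."""
    if event["type"] == "deposited":
        return balance + event["amount"]
    if event["type"] == "withdrawn":
        return balance - event["amount"]
    return balance

# Rebuild state entirely from the event log.
balance = 0
for event in events:
    balance = apply(balance, event)
```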
The practice of capturing data in real-time from event sources like databases, sensors, mobile devices, cloud services, and software applications in the form of streams.
A software architecture pattern promoting the production, detection, consumption of, and reaction to events.
A cybersecurity technology that addresses the need for continuous monitoring and response to advanced threats.
The practice of securing endpoints or entry points of end-user devices like desktops, laptops, and mobile devices from being exploited by malicious actors.
The process of converting data into a coded format to prevent unauthorized access.
The practice of managing computing environments such as development, testing, and production in a declarative and programmatic manner.
A backup operational mode in which the functions of a system component (such as a processor, server, network, or database) are assumed by secondary system components when the primary component becomes unavailable.
A software development practice that allows developers to enable or disable features or functionalities remotely without deploying new code.
A technique that allows developers to turn features on or off in an application without deploying new code.
An arrangement that can be made among multiple enterprises that lets subscribers use the same identification data to obtain access to the networks of all enterprises in the group.
A machine learning technique that trains an algorithm across multiple decentralized devices or servers holding local data samples, without exchanging them.
The process of searching multiple data sources from a single search interface, returning integrated search results.
An internal control or process that validates the integrity of operating system and application software files by comparing the current file state against a known good baseline.
A standard network protocol used for the transfer of computer files between a client and server on a computer network.
A network security system that monitors and controls incoming and outgoing network traffic.
A cloud service that provides firewall protection to an organization’s IT infrastructure.
A developer who is skilled in working on both the front-end and back-end portions of an application.
The development of both the front end (client-side) and the back end (server-side) of a web application.
A cloud computing service that allows customers to execute code in response to events without the complexity of building and maintaining the infrastructure.
A standardized method for measuring the functional size of a software application.
A programming paradigm where programs are constructed by applying and composing functions.
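A small sketch of the idea: programs are built by composing pure functions. The `compose` helper and the example functions are hypothetical:

```python
from functools import reduce

def compose(*funcs):
    """Compose functions right-to-left: compose(f, g)(x) == f(g(x))."""
    return reduce(lambda f, g: lambda x: f(g(x)), funcs)

double = lambda x: x * 2
increment = lambda x: x + 1

# Build a new function purely by composing existing ones.
double_then_increment = compose(increment, double)
result = double_then_increment(5)  # (5 * 2) + 1
```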
A type of bar chart that illustrates a project schedule and shows the dependency relationships between activities and the current schedule status.
The process of automatically freeing memory on the heap by deleting objects that are no longer accessible in a program.
A concept that emphasizes the importance of correct input data to ensure correct output or results.
A distributed version-control system for tracking changes in source code during software development.
An operational framework that takes DevOps best practices used for application development, such as version control, collaboration, compliance, and CI/CD, and applies them to infrastructure automation.
A continuous deployment methodology that uses Git repositories as the source of truth for declarative infrastructure and applications.
A CI/CD and automation platform that allows users to automate their software workflows directly from their GitHub repositories.
A suite of cloud computing services that runs on the same infrastructure Google uses internally for its end-user products.
A specialized processor designed to accelerate graphics rendering and perform parallel processing.
A database designed to treat the relationships between data as equally important to the data itself, intended to hold data without constricting it to a pre-defined model.
A query language for your API that allows clients to request exactly the data they need.
A tool for chaos engineering that helps companies ensure their systems can withstand failures and errors.
A project that is started from scratch without needing to consider any prior work.
An open-source framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models.
A distributed file system designed to run on commodity hardware, part of the Hadoop ecosystem.
The process of converting an input (or ‘message’) into a fixed-size string of bytes, typically a hash code.
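For example, with SHA-256 from Python's standard library, any input maps to a fixed-size digest, and the same input always yields the same digest:

```python
import hashlib

message = b"hello"
digest = hashlib.sha256(message).hexdigest()
# The digest is always 64 hex characters (256 bits) regardless of
# input size, and hashing the same input reproduces the same digest.
```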
A system design approach and associated service implementation that ensures a certain level of operational performance, usually uptime, for a higher-than-normal period.
A network that consists of different types of devices and systems, such as a mix of hardware, operating systems, and applications.
A security mechanism set to detect, deflect, or counteract attempts at unauthorized use of information systems.
A quick fix or patch to a bug or security issue in a software product that is deployed without the usual testing process due to the urgency of the fix.
A computing environment that combines a public cloud and a private cloud by allowing data and applications to be shared between them.
The practice of using both in-house and cloud-based services to manage a company’s information technology environment.
The use of advanced technologies, such as artificial intelligence and machine learning, to automate tasks that were once done by humans.
An IT framework that combines storage, computing, and networking into a single system to reduce data center complexity and increase scalability.
An extension of HTTP used for secure communication over a computer network, widely used on the internet.
Software, firmware, or hardware that creates and runs virtual machines by separating the underlying hardware from the operating systems.
A form of cloud computing that provides virtualized computing resources over the internet.
A framework of policies and technologies for ensuring that the proper people in an enterprise have the appropriate access to technology resources.
A cloud-based service that provides identity management solutions, including single sign-on (SSO), authentication, and identity governance.
A system entity that creates, maintains, and manages identity information for principals and provides principal authentication to other service providers within a federation.
An approach where servers are never modified after deployment. Instead, if something needs to be updated, a new server is built from a common image.
An object whose state cannot be modified after it is created, often used in functional programming to prevent side effects.
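In Python, one way to express this is a frozen dataclass, which rejects any mutation after construction (the `Point` type here is a hypothetical example):

```python
from dataclasses import dataclass, FrozenInstanceError

@dataclass(frozen=True)
class Point:
    x: int
    y: int

p = Point(1, 2)
try:
    p.x = 99  # any attempt to mutate raises FrozenInstanceError
    mutated = True
except FrozenInstanceError:
    mutated = False
```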
The process used by DevOps and IT operations teams to respond to an unplanned event or service interruption and restore the service to its operational state.
The process of managing and provisioning computing infrastructure through machine-readable definition files, rather than physical hardware.
The automated arrangement, coordination, and management of complex computer systems, middleware, and services.
An application that runs in a Kubernetes cluster and configures an HTTP load balancer according to Ingress resources.
Network traffic that originates from outside a network and is transmitted to a host or server inside the network.
The process of combining different computing systems and software applications to work together within a larger system.
The practice of storing data in the main RAM of servers rather than on slower disk-based storage, significantly speeding up data processing.
Pieces of forensic data that identify potentially malicious activity on a system or network.
The interconnection of computing devices embedded in everyday objects, enabling them to send and receive data.
A strategic approach to designing, delivering, managing, and improving the way IT is used within an organization.
An abstract machine that enables a computer to run Java programs as well as programs written in other languages that are also compiled to Java bytecode.
A programming language commonly used to create interactive effects within web browsers.
An open-source automation server used to automate parts of software development related to building, testing, and deploying.
A suite of plugins that supports implementing and integrating continuous delivery pipelines into Jenkins.
A proprietary issue-tracking product developed by Atlassian that allows bug tracking and agile project management.
The process of automating the execution of tasks, jobs, or workflows in computing environments.
A lightweight format for storing and transporting data, often used when data is sent from a server to a web page.
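A quick round-trip with Python's standard `json` module (the payload is a hypothetical example):

```python
import json

payload = {"id": 7, "tags": ["a", "b"], "active": True}
text = json.dumps(payload)    # serialize Python objects to a JSON string
restored = json.loads(text)   # parse the string back into Python objects
```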
An open standard (RFC 7519) that defines a compact and self-contained way of securely transmitting information between parties as a JSON object.
A runtime environment component that improves the performance of interpreted programs by translating bytecode into native machine code at runtime.
A software architecture pattern designed to handle real-time data processing.
A network authentication protocol designed to provide strong authentication for client/server applications.
A secure method for authenticating a request for a service in a computer network.
A service that helps you create, manage, and control cryptographic keys across various applications and services.
The process of periodically changing cryptographic keys to limit the amount of data encrypted with the same key and reduce the impact of a key compromise.
A data visualization and exploration tool used for log and time-series analytics, application monitoring, and operational intelligence use cases.
A suite of services provided by AWS for real-time data processing, including ingesting, processing, and analyzing streaming data.
An open-source platform for automating the deployment, scaling, and management of containerized applications.
A method of packaging, deploying, and managing a Kubernetes application.
The delay before a transfer of data begins following an instruction for its transfer.
Techniques and strategies used to reduce the time delay in data communication and processing.
A security principle in which a user is given the minimum levels of access – or permissions – needed to perform their job functions.
A protocol for accessing and maintaining distributed directory information services over a network.
A device that distributes network or application traffic across several servers to ensure reliability and performance.
A method used by load balancers to distribute incoming traffic among servers to ensure no single server is overwhelmed.
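Round-robin is the simplest such method: requests take turns across the server pool. A sketch with a hypothetical three-server pool:

```python
import itertools

servers = ["app-1", "app-2", "app-3"]  # hypothetical backend pool
rotation = itertools.cycle(servers)

# Each incoming request is assigned the next server in the cycle,
# so traffic spreads evenly across the pool.
assigned = [next(rotation) for _ in range(5)]
```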
The practice of dropping some load on a system when it is overwhelmed, to maintain responsiveness for high-priority tasks.
The practice of putting demand on a software system and measuring its response.
The practice of analyzing log data to extract meaningful insights for monitoring and troubleshooting purposes.
The process of dealing with computer-generated log messages, which includes their collection, storage, and analysis.
A software development approach that requires little to no coding to build applications and processes.
A subset of artificial intelligence that involves the use of data and algorithms to imitate the way humans learn, gradually improving its accuracy.
A company that remotely manages a customer’s IT infrastructure and/or end-user systems, typically on a proactive basis and under a subscription model.
A model for providing web and mobile app developers with a way to link their applications to backend cloud storage and APIs exposed by backend applications.
A form of asynchronous service-to-service communication used in serverless and microservices architectures.
The practice of managing and governing data about other data, often used in data lakes and data warehouses.
An architectural style where independently deliverable frontend applications are composed into a greater whole.
An architectural style that structures an application as a collection of small autonomous services modeled around a business domain.
An architectural style that structures an application as a collection of loosely coupled services.
Software that acts as a bridge between an operating system or database and applications, especially on a network.
A set of practices that combines Machine Learning, DevOps, and data engineering to deploy and maintain ML systems in production reliably and efficiently.
A traditional unified model for the design of a software program. It often contrasts with microservices architecture.
An architecture in which a single software instance serves multiple customers (tenants).
A container for a set of identifiers that allows the same identifier to be used in different contexts without collision.
The practice of isolating groups of resources or processes to prevent them from interacting or interfering with each other.
A method of remapping one IP address space into another by modifying network address information in the IP header of packets.
A file-level storage architecture that makes stored data more accessible to networked devices.
A network architecture concept that uses IT virtualization technologies to virtualize entire classes of network node functions into building blocks that may connect or chain together to create communication services.
The totality of all hardware and software components that build the environment in which VNFs are deployed.
A load balancer that directs client requests at the transport layer (TCP/UDP) and routes them to backend servers.
Policies and practices adopted to prevent and monitor unauthorized access, misuse, modification, or denial of a computer network and its resources.
The arrangement of the various elements (links, nodes, etc.) of a computer network, often represented as a graph.
The process of combining hardware and software network resources and network functionality into a single, software-based administrative entity.
A concept that represents the goal of completely automating the deployment, monitoring, and management of applications and the infrastructure on which they run.
A non-relational database that allows for storage and retrieval of data modeled in means other than tabular relations used in relational databases.
An open standard for access delegation, commonly used to grant websites or applications access to information on other websites without exposing passwords.
Permissions that define what access levels are given to third-party applications when accessing user data via OAuth.
The second version of the OAuth protocol, focusing on client developer simplicity while providing specific authorization flows for web applications, desktop applications, mobile phones, and living room devices.
A type of storage architecture that manages data as objects, in contrast to file systems, which manage data as a file hierarchy, and block storage, which manages data as blocks within sectors and tracks.
The measure of how well you can understand the internal states of a system based on the data it produces.
A suite of tools and processes used to monitor the health and performance of applications and infrastructure, often including metrics, logs, and tracing.
Software and technology that is located within the physical confines of an enterprise – often in the company’s data center – as opposed to running remotely on hosted servers or in the cloud.
Software for which the source code is made freely available and may be redistributed and modified.
A family of containerization software developed by Red Hat for Kubernetes, including an enterprise Kubernetes container platform.
A service mesh for OpenShift, providing consistent management, security, and observability across microservices.
An open-source software platform for cloud computing, primarily deployed as infrastructure-as-a-service.
A simple identity layer on top of the OAuth 2.0 protocol that allows clients to verify the identity of the end user based on the authentication performed by an authorization server.
An alerting and incident response solution for development and operations teams used to manage and resolve incidents faster.
A multinational computer technology corporation that sells database software and technology, cloud engineered systems, and enterprise software products.
The automated arrangement, coordination, and management of complex computer systems, middleware, and services.
A simulated cyber attack against your computer system to check for exploitable vulnerabilities.
A cloud computing service that provides a platform allowing customers to develop, run, and manage applications without the complexity of building and maintaining the infrastructure.
The practice of building and maintaining a self-service platform for developers to deploy their applications.
The coexistence of multiple kinds of database technologies, each chosen for its particular strengths and weaknesses.
A form of advanced analytics that makes predictions about future events, typically using statistical algorithms and machine learning techniques.
A cloud computing model where the infrastructure is dedicated to a single organization, offering more control over data, security, and quality of service.
A framework for creating a secure method for exchanging information based on public key cryptography.
The process of setting up IT infrastructure, such as servers, storage, network equipment, and other resources.
A messaging pattern where senders of messages, called publishers, do not program the messages to be sent directly to specific receivers, called subscribers.
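Instead, publishers and subscribers share only a topic name, and a broker fans messages out. A minimal in-memory sketch (the `Broker` class and topic names are hypothetical; real systems use a message broker such as Kafka or RabbitMQ):

```python
from collections import defaultdict

class Broker:
    """Minimal in-memory pub/sub: publishers never address subscribers
    directly; both sides only know the topic name."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Fan the message out to every subscriber of this topic.
        for callback in self.subscribers[topic]:
            callback(message)

broker = Broker()
received = []
broker.subscribe("orders", received.append)
broker.publish("orders", {"order_id": 1})
broker.publish("payments", {"amount": 5})  # no subscriber: message is simply dropped
```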
A high-level programming language widely used in software development, data science, and machine learning.
The description or measurement of the overall performance of a service, such as a telephony or computer network or a cloud computing service.
A data structure or service that manages a sequence of elements with a first-in, first-out (FIFO) ordering.
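FIFO ordering means the oldest element is always served first. A sketch using Python's `collections.deque`:

```python
from collections import deque

queue = deque()
queue.append("first")    # enqueue at the tail
queue.append("second")
item = queue.popleft()   # dequeue from the head: oldest element first
```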
The number of outstanding requests or operations waiting to be processed by a device or system.
The capability to run a single query that fetches data from multiple heterogeneous data sources.
A type of computer language used to make queries in databases and information systems.
Techniques used to improve query performance in databases.
A quantum algorithm for finding the global minimum of a function over a given set of candidate solutions, widely used in quantum computing.
A type of computing that takes advantage of quantum phenomena like superposition and entanglement to perform computation.
A method of encryption that leverages the principles of quantum mechanics to secure data, ensuring it remains confidential and integral.
A secure communication method that implements a cryptographic protocol involving components of quantum mechanics.
The minimum number of members of a group or committee that must be present to make the proceedings of that meeting valid.
The process of controlling the rate of traffic sent or received by an API to prevent abuse or overuse.
A tool or technique used to control the amount of traffic sent or received in a network.
A database management system (DBMS) based on the relational model introduced by E.F. Codd.
A Kubernetes distribution focused on developer experience and application security that’s platform-agnostic.
A data storage virtualization technology that combines multiple physical disk drive components into one or more logical units.
The delay between the time when data is updated on the primary database and when the update is applied to the replica database.
A log that is copied across multiple machines, often used in distributed systems to ensure consistency and reliability.
An architectural style for designing networked applications, relying on a stateless, client-server communication protocol, usually HTTP.
The ability of a system or network to withstand and recover quickly from difficulties.
A server that sits in front of web servers and forwards client requests (like web browsers) to those web servers.
The use of software with artificial intelligence (AI) and machine learning capabilities to handle high-volume, repeatable tasks that previously required a human to perform.
A method for managing time-shared resources in which each task is assigned a fixed time slice in a cyclic order.
A software release strategy where an update is gradually rolled out to users and infrastructure in a staggered approach.
A software distribution model in which applications are hosted by a vendor or service provider and made available to customers over the Internet.
A cloud-based software company that provides customer relationship management service and a suite of enterprise applications focused on customer service, marketing automation, analytics, and application development.
The capability of a system, network, or process to handle a growing amount of work or its potential to be enlarged to accommodate that growth.
An agile process framework for managing complex knowledge work, with an initial emphasis on software development.
A facilitator for an agile development team. Scrum is a methodology that allows a team to self-organize and make changes quickly.
A set of tools and services offering a holistic view of an organization’s information security.
A cloud computing execution model where the cloud provider dynamically manages the allocation and provisioning of servers.
A cloud-computing execution model in which the cloud provider runs the server, and dynamically manages the allocation of machine resources.
A formal commitment between a service provider and a client that outlines the expected level of service and responsibilities.
An architectural pattern in software design in which application components provide services to other components via a communications protocol over a network.
A discipline that incorporates aspects of software engineering and applies them to infrastructure and operations problems.
A centralized function within an organization that employs people, processes, and technology to continuously monitor and improve the organization’s security posture.
A collection of software tools and libraries that developers use to build applications for specific platforms.
An approach to networking that uses open protocols to enable software control of networks instead of hardware control.
A virtual WAN architecture that allows enterprises to securely connect any user to any application, leveraging the internet or private MPLS.
Refers to whether a system maintains state information across sessions (stateful) or treats each session as independent and maintains no state (stateless).
The practice of monitoring applications by simulating user interactions.
Syntax within a programming language that is designed to make code easier to read or to express, making the language “sweeter” for human use.
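For example, a Python list comprehension is syntactic sugar for an explicit accumulation loop:

```python
# Desugared form: an explicit loop building a list of squares.
squares_loop = []
for n in range(5):
    squares_loop.append(n * n)

# Sugared form: a list comprehension expresses the same computation tersely.
squares_sugar = [n * n for n in range(5)]

assert squares_loop == squares_sugar == [0, 1, 4, 9, 16]
```

Both forms compute the same result; the sugar only changes how the code reads.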
An open-source infrastructure-as-code software tool that provides a consistent CLI workflow to manage hundreds of cloud services.
The concept that expediting software development results in more costly fixes and updates later, due to rushed or incomplete coding.
Short for technical debt, which refers to the implied cost of additional rework caused by choosing an easy solution now instead of using a better approach that would take longer.
A software development process that relies on the repetition of a very short development cycle: requirements are turned into specific test cases, then the software is improved to pass the new tests.
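A toy illustration of that cycle in Python (the `slugify` function and its test are invented for this sketch): the test is written first and fails, then just enough code is added to make it pass.

```python
import unittest

# Step 1: write the test first; it fails until the code below exists.
class TestSlugify(unittest.TestCase):
    def test_spaces_become_hyphens(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

# Step 2: write just enough code to make the test pass.
def slugify(text: str) -> str:
    return text.strip().lower().replace(" ", "-")

unittest.main(argv=["tdd-sketch"], exit=False)
```

Further requirements would be added the same way: new failing test first, then the minimal code change.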
A container for multiple resources that are used together. A module could be used to represent a component, such as a set of Amazon VPC resources.
A process by which potential threats, such as structural vulnerabilities or the absence of appropriate safeguards, can be identified, enumerated, and mitigated.
The process of protecting sensitive data by replacing it with a non-sensitive, algorithmically generated value called a token.
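A minimal in-memory sketch of the idea (a real deployment would use a hardened, audited token service, not a Python dict):

```python
import secrets

class TokenVault:
    """Illustrative tokenization vault: the token reveals nothing about the data."""
    def __init__(self):
        self._vault = {}

    def tokenize(self, sensitive: str) -> str:
        token = secrets.token_hex(8)     # random value, carries no information
        self._vault[token] = sensitive   # the mapping lives only inside the vault
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]

vault = TokenVault()
tok = vault.tokenize("4111-1111-1111-1111")
assert tok != "4111-1111-1111-1111"
assert vault.detokenize(tok) == "4111-1111-1111-1111"
```

Systems outside the vault can store and pass around the token freely, since it cannot be reversed without the vault.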
The practice of regulating network data transfer to ensure a certain level of performance, quality, or bandwidth.
A history of actions executed by a database management system used for recovery in case of a failure.
A cryptographic protocol designed to provide communications security over a computer network.
A system of data manipulation rules is said to be “Turing complete” if it can be used to simulate any Turing machine.
An extra layer of security used to ensure that people trying to gain access to an online account are who they say they are.
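The second factor is often a one-time code; for instance, the HOTP algorithm (RFC 4226) that underlies many authenticator codes can be sketched as:

```python
import hashlib, hmac, struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226), a common 2FA second factor."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                              # dynamic truncation offset
    code = int.from_bytes(mac[offset:offset + 4], "big") & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Matches the published RFC 4226 Appendix D test vector for counter 0.
assert hotp(b"12345678901234567890", 0) == "755224"
```

Time-based codes (TOTP) use the same construction with the counter derived from the current time.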
A security measure that requires two individuals to be present or participate in a sensitive activity to prevent malicious actions.
The last phase of the software testing process, in which actual software users test the software to make sure it can handle required tasks in real-world scenarios.
UI refers to the aesthetic elements by which people interact with a product, while UX refers to the experience a user has when interacting with a product or service.
A framework for integrating various real-time communication tools such as voice, video, chat, and email.
A standardized modeling language used to specify, visualize, construct, and document the artifacts of software systems.
A single solution that provides multiple security functions, such as antivirus, anti-spyware, anti-spam, network firewall, intrusion detection and prevention, and content filtering.
A set of networking protocols that permits networked devices to seamlessly discover each other’s presence on the network and establish functional network services.
A device that provides emergency power to a load when the input power source or mains power fails.
A software testing method by which individual units of source code are tested to determine whether they are fit for use.
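A small example using Python's `unittest` module (the leap-year function is invented for illustration): each test exercises one behavior of a single unit in isolation.

```python
import unittest

def is_leap_year(year: int) -> bool:
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

class TestIsLeapYear(unittest.TestCase):
    def test_divisible_by_four(self):
        self.assertTrue(is_leap_year(2024))

    def test_century_not_leap(self):
        self.assertFalse(is_leap_year(1900))

    def test_four_hundred_is_leap(self):
        self.assertTrue(is_leap_year(2000))

unittest.main(argv=["unit-sketch"], exit=False)
```

Because each test targets one unit, a failure points directly at the code responsible.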
A written description of how users will perform tasks on your website or application, detailing the user’s interactions and system responses.
A communication protocol used across the internet for especially time-sensitive transmissions such as video playback or DNS lookups.
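Its connectionless nature is visible in code: a sender simply fires a datagram with no handshake and no delivery guarantee. A loopback sketch in Python:

```python
import socket

# Receiver bound to an OS-assigned port on the loopback interface.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))
receiver.settimeout(2)
port = receiver.getsockname()[1]

# UDP is connectionless: no connect(), no handshake, no retransmission.
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"ping", ("127.0.0.1", port))

data, addr = receiver.recvfrom(1024)
assert data == b"ping"
sender.close()
receiver.close()
```

The lack of connection setup and acknowledgments is exactly what keeps latency low for video and DNS traffic.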
The point of interaction between a user and a digital device or application, including screens, pages, buttons, icons, and any visual element.
An informal, natural language description of one or more features of a software system, written from the perspective of an end user or user of a system.
A visual exercise that helps product teams define the work that will create the most delightful user experience.
Increasing the capacity of a single server, such as adding more CPU or RAM to handle more load.
A system that records changes to a file or set of files over time so that you can recall specific versions later.
A technology used to create a virtualized desktop environment on a remote server setup.
An emulation of a computer system, providing the functionality of a physical computer.
A private cloud that exists within a shared or public cloud, allowing users to control the virtual networking environment.
The process of creating a virtual version of something, including virtual computer hardware platforms, storage devices, and computer network resources.
A cloud computing virtualization platform for building cloud infrastructures, primarily used in data centers.
A collection of logical volumes in storage that can be managed together.
A feature in Microsoft Windows that allows taking manual or automatic backup copies or snapshots of computer files or volumes.
A service that encrypts your internet traffic and protects your online identity by hiding your IP address.
A group of host computers and servers that are configured as if they are on the same LAN even if they reside across multiple different physical LANs.
A telecommunications network that extends over a large geographic area for the primary purpose of computer networking.
A firewall that monitors, filters, and blocks data packets as they travel to and from a web application.
The work involved in developing a website for the internet or an intranet.
A standardized way of integrating web-based applications using open standards over an internet protocol backbone.
A binary instruction format for a stack-based virtual machine, designed to be a portable target for the compilation of high-level languages like C, C++, and Rust.
A free, open-source project that provides web browsers and mobile applications with real-time communication via simple application programming interfaces.
Automated messages sent from apps when something happens, often used in APIs to provide real-time information to another app.
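Because the receiver accepts inbound calls from the internet, webhook payloads are commonly signed so the receiver can verify the sender. A sketch of that pattern (the secret and event shape here are hypothetical):

```python
import hashlib, hmac, json

SECRET = b"shared-webhook-secret"   # hypothetical secret agreed with the receiver

def sign(payload: bytes) -> str:
    """Senders attach an HMAC signature (often as an HTTP header) so the
    receiver can check the webhook really came from the expected app."""
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, signature: str) -> bool:
    # compare_digest avoids timing side channels in the comparison.
    return hmac.compare_digest(sign(payload), signature)

event = json.dumps({"event": "order.paid", "id": 42}).encode()
assert verify(event, sign(event))
assert not verify(b"tampered", sign(event))
```

The receiver recomputes the HMAC over the raw body and rejects any payload whose signature does not match.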
A visual guide that represents the skeletal framework of a website or application.
The design, execution, and automation of processes based on pre-defined business rules.
Software designed to help streamline and automate business processes, improving efficiency and consistency.
A method used in databases to provide atomicity and durability (two of the ACID properties) in database transactions.
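A toy sketch of the core idea: every change is appended to the log before it is applied, so after a crash the state can be rebuilt by replaying the log. (Real databases fsync the log to disk; this in-memory version only illustrates the ordering.)

```python
class WriteAheadLog:
    """Toy write-ahead log: log first, apply second, replay on recovery."""
    def __init__(self):
        self.log = []     # durable log (a real DB forces this to disk first)
        self.table = {}   # the actual data

    def put(self, key, value):
        self.log.append(("put", key, value))  # 1. record the intent...
        self.table[key] = value               # 2. ...then apply the change

    def recover(self):
        """Rebuild the table purely from the log, as after a crash."""
        rebuilt = {}
        for op, key, value in self.log:
            if op == "put":
                rebuilt[key] = value
        return rebuilt

wal = WriteAheadLog()
wal.put("a", 1)
wal.put("a", 2)
assert wal.recover() == wal.table == {"a": 2}
```

Writing the log entry before touching the data is what makes an interrupted update recoverable.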
A collective term referring to delivering IT services via the cloud rather than on-premises.
A language for selecting nodes from an XML document; it is also used with XSLT to extract values from XML documents.
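For example, Python's standard-library `xml.etree.ElementTree` supports a limited XPath subset for node selection (the catalog document below is invented for illustration):

```python
import xml.etree.ElementTree as ET

doc = ET.fromstring("""
<catalog>
  <book lang="en"><title>Dune</title></book>
  <book lang="fr"><title>Vendredi</title></book>
</catalog>
""")

# Path expressions select nodes by position in the tree...
titles = [t.text for t in doc.findall("./book/title")]
# ...and predicates filter by attribute value.
french = doc.findall("./book[@lang='fr']/title")

assert titles == ["Dune", "Vendredi"]
assert french[0].text == "Vendredi"
```

Full XPath engines (e.g. in XSLT processors) add functions, axes, and richer predicates on top of this path-and-predicate core.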
A standard defining the format of public key certificates, used in many internet protocols, including TLS/SSL.
A hypervisor that enables multiple operating systems to run on the same physical server at the same time.
A markup language that defines a set of rules for encoding documents in a format that is both human-readable and machine-readable.
A way to describe the structure and validate the content of XML documents.
A protocol for real-time, decentralized messaging, presence, and request-response services.
A type of security vulnerability typically found in web applications, where attackers inject malicious scripts into content from otherwise trusted websites.
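The basic defense is to escape untrusted input before embedding it in HTML, as in this Python sketch (the `render_comment` helper is invented for illustration):

```python
import html

def render_comment(user_input: str) -> str:
    """Escape untrusted input so it renders as text, not as markup."""
    return "<p>" + html.escape(user_input) + "</p>"

attack = '<script>alert("xss")</script>'
safe = render_comment(attack)
assert "<script>" not in safe   # the payload can no longer execute
print(safe)
```

Escaping at output time neutralizes injected markup; template engines typically do this automatically, and must be combined with context-appropriate encoding for attributes, URLs, and JavaScript.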
A feature in browsers and web applications to protect users from cross-site scripting attacks.
A human-readable data serialization standard that can be used in conjunction with all programming languages and is often used to write configuration files.
A data modeling language used to model configuration and state data manipulated by the Network Configuration Protocol (NETCONF).
A package manager for the JavaScript programming language that acts as an alternative to npm.
A command-line package management utility for computers running the Linux operating system using the RPM Package Manager.
A business intelligence and analytics software suite that offers data visualization, dashboards, and reporting.
A unit of data storage size equal to 1 septillion bytes (10^24 bytes) or 1 trillion terabytes.
A term used in data analytics and financial technology to describe the relationship between interest rates and time to maturity.
An open-source monitoring software tool for diverse IT components, including networks, servers, virtual machines, and cloud services.
A cyber attack that occurs on the same day a weakness is discovered in software before the developer has been able to fix it.
A software security flaw that is known to the software vendor but does not have a patch in place to fix the flaw.
A security concept centered on the belief that organizations should not automatically trust anything inside or outside their perimeters and must verify anything and everything trying to connect to their systems before granting access.
A security model that requires strict identity verification for every person and device trying to access resources on a private network, regardless of whether they are inside or outside the network perimeter.
A centralized service for maintaining configuration information, naming, providing distributed synchronization, and providing group services in a distributed environment.
A group of Zookeeper servers that work together to provide reliable coordination services to distributed systems.
A DNS transaction that moves all or part of a domain from a primary to a secondary DNS server.