---
name: polyglot-integration
description: Integrate multiple programming languages using FFI, native bindings, gRPC, or language bridges. Use when combining strengths of different languages or integrating legacy systems.
---
# Polyglot Integration
## Overview
Integrate code written in different programming languages to leverage their unique strengths and ecosystems.
## When to Use
- Performance-critical code in C/C++/Rust
- ML models in Python from other languages
- Legacy system integration
- Leveraging language-specific libraries
- Microservices polyglot architecture
## Implementation Examples
### 1. **Node.js Native Addons (C++)**
```cpp
// addon.cc
#include <node.h>

namespace demo {

using v8::Exception;
using v8::FunctionCallbackInfo;
using v8::Isolate;
using v8::Local;
using v8::Number;
using v8::Object;
using v8::String;
using v8::Value;

// Adds two JavaScript numbers and returns the result.
void Add(const FunctionCallbackInfo<Value>& args) {
  Isolate* isolate = args.GetIsolate();

  if (args.Length() < 2) {
    isolate->ThrowException(Exception::TypeError(
        String::NewFromUtf8(isolate, "Wrong number of arguments").ToLocalChecked()));
    return;
  }

  if (!args[0]->IsNumber() || !args[1]->IsNumber()) {
    isolate->ThrowException(Exception::TypeError(
        String::NewFromUtf8(isolate, "Arguments must be numbers").ToLocalChecked()));
    return;
  }

  double value = args[0].As<Number>()->Value() + args[1].As<Number>()->Value();
  args.GetReturnValue().Set(Number::New(isolate, value));
}

void Initialize(Local<Object> exports) {
  NODE_SET_METHOD(exports, "add", Add);
}

NODE_MODULE(NODE_GYP_MODULE_NAME, Initialize)

}  // namespace demo
```
```javascript
// Usage in Node.js
const addon = require('./build/Release/addon');
console.log(addon.add(3, 5)); // 8
```
### 2. **Python from Node.js**
```typescript
// Using child_process
import { spawn } from 'child_process';

function callPython(script: string, args: any[] = []): Promise<any> {
  return new Promise((resolve, reject) => {
    const python = spawn('python3', [script, ...args.map(String)]);
    let stdout = '';
    let stderr = '';

    python.stdout.on('data', (data) => {
      stdout += data.toString();
    });
    python.stderr.on('data', (data) => {
      stderr += data.toString();
    });
    // Reject if the interpreter itself cannot be started.
    python.on('error', reject);
    python.on('close', (code) => {
      if (code !== 0) {
        reject(new Error(stderr));
      } else {
        try {
          resolve(JSON.parse(stdout));
        } catch (error) {
          resolve(stdout);
        }
      }
    });
  });
}

// Usage
const result = await callPython('./ml_model.py', [100, 200]);
console.log('Python result:', result);
```
```python
# ml_model.py
import sys
import json

def predict(x, y):
    # ML model logic
    return x * 2 + y

if __name__ == '__main__':
    x = int(sys.argv[1])
    y = int(sys.argv[2])
    result = predict(x, y)
    print(json.dumps({'prediction': result}))
```
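The `callPython()` wrapper above rejects whenever the script exits non-zero, so the Python side can signal failures by writing to stderr and exiting with a non-zero status. A minimal sketch of that convention is below; the argument validation is illustrative and not part of the original script:

```python
# ml_model.py with explicit error reporting for the Node.js caller
import sys
import json

def predict(x, y):
    return x * 2 + y

if __name__ == '__main__':
    try:
        if len(sys.argv) != 3:
            raise ValueError('expected exactly two numeric arguments')
        x, y = int(sys.argv[1]), int(sys.argv[2])
        print(json.dumps({'prediction': predict(x, y)}))
    except Exception as exc:
        # A non-zero exit code makes callPython() reject with this message.
        print(str(exc), file=sys.stderr)
        sys.exit(1)
```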
### 3. **Rust from Python (PyO3)**
```rust
// lib.rs
use pyo3::prelude::*;

#[pyfunction]
fn fibonacci(n: u64) -> PyResult<u64> {
    let mut a = 0;
    let mut b = 1;
    for _ in 0..n {
        let temp = a;
        a = b;
        b = temp + b;
    }
    Ok(a)
}

#[pyfunction]
fn process_large_array(arr: Vec<f64>) -> PyResult<f64> {
    Ok(arr.iter().sum())
}

#[pymodule]
fn rust_extension(_py: Python, m: &PyModule) -> PyResult<()> {
    m.add_function(wrap_pyfunction!(fibonacci, m)?)?;
    m.add_function(wrap_pyfunction!(process_large_array, m)?)?;
    Ok(())
}
```
```python
# Usage in Python
import rust_extension

# fibonacci returns a u64, so keep n at 93 or below to avoid overflow
result = rust_extension.fibonacci(90)
print(f"Fibonacci: {result}")

arr = list(range(1000000))
total = rust_extension.process_large_array(arr)
print(f"Sum: {total}")
```
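To check whether the native extension actually pays off for a given workload, a rough timing sketch like the one below can compare it against plain Python. This assumes the crate above has been built and installed as the `rust_extension` module:

```python
# Rough comparison of the Rust extension against the built-in sum().
import time
import rust_extension

arr = [float(i) for i in range(1_000_000)]

start = time.perf_counter()
python_total = sum(arr)
python_time = time.perf_counter() - start

start = time.perf_counter()
rust_total = rust_extension.process_large_array(arr)
rust_time = time.perf_counter() - start

print(f"python: {python_total} in {python_time:.4f}s")
print(f"rust:   {rust_total} in {rust_time:.4f}s")
```

For a plain sum the conversion of the Python list into a `Vec<f64>` can dominate, so measure the real workload rather than relying on a micro-benchmark.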
### 4. **gRPC Polyglot Communication**
```protobuf
// service.proto
syntax = "proto3";
service Calculator {
  rpc Add (NumberPair) returns (Result);
  rpc Multiply (NumberPair) returns (Result);
}

message NumberPair {
  double a = 1;
  double b = 2;
}

message Result {
  double value = 1;
}
```
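Both sides need stubs generated from this contract. For the Python server below, the `grpcio-tools` package produces `service_pb2.py` and `service_pb2_grpc.py`; one way to drive it, equivalent to running `python -m grpc_tools.protoc` on the command line, is sketched here:

```python
# generate_stubs.py - produce service_pb2.py and service_pb2_grpc.py
from grpc_tools import protoc

protoc.main([
    'grpc_tools.protoc',
    '-I.',
    '--python_out=.',
    '--grpc_python_out=.',
    'service.proto',
])
```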
```python
# Python gRPC Server
import grpc
from concurrent import futures
import service_pb2
import service_pb2_grpc

class Calculator(service_pb2_grpc.CalculatorServicer):
    def Add(self, request, context):
        return service_pb2.Result(value=request.a + request.b)

    def Multiply(self, request, context):
        return service_pb2.Result(value=request.a * request.b)

def serve():
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=10))
    service_pb2_grpc.add_CalculatorServicer_to_server(Calculator(), server)
    server.add_insecure_port('[::]:50051')
    server.start()
    server.wait_for_termination()

if __name__ == '__main__':
    serve()
```
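Before wiring up the Node.js client below, the server can be smoke-tested from Python with the same generated stubs; a minimal sketch:

```python
# Quick local check of the Calculator service using the generated stubs.
import grpc
import service_pb2
import service_pb2_grpc

with grpc.insecure_channel('localhost:50051') as channel:
    stub = service_pb2_grpc.CalculatorStub(channel)
    result = stub.Add(service_pb2.NumberPair(a=10, b=20))
    print('Add result:', result.value)
```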
```typescript
// Node.js gRPC Client
import * as grpc from '@grpc/grpc-js';
import * as protoLoader from '@grpc/proto-loader';

const packageDefinition = protoLoader.loadSync('service.proto');
// loadPackageDefinition is loosely typed, so cast before using the service constructor.
const proto = grpc.loadPackageDefinition(packageDefinition) as any;

const client = new proto.Calculator(
  'localhost:50051',
  grpc.credentials.createInsecure()
);

client.Add({ a: 10, b: 20 }, (error: Error | null, response: { value: number }) => {
  if (error) {
    console.error(error);
  } else {
    console.log('Result:', response.value);
  }
});
```
### 5. **Java from Python (Py4J)**
```java
// JavaApp.java
import py4j.GatewayServer;

public class JavaApp {
    public int add(int a, int b) {
        return a + b;
    }

    public String processData(String data) {
        return data.toUpperCase();
    }

    public static void main(String[] args) {
        JavaApp app = new JavaApp();
        GatewayServer server = new GatewayServer(app);
        server.start();
    }
}
```
```python
# Python client
from py4j.java_gateway import JavaGateway
gateway = JavaGateway()
app = gateway.entry_point
result = app.add(10, 20)
print(f"Result: {result}")
processed = app.processData("hello world")
print(f"Processed: {processed}")
```
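Py4J also exposes the JVM itself, so Java standard-library classes can be used without adding methods to the entry point. A small sketch, assuming the gateway server above is already running:

```python
# Instantiate JVM classes directly through the running gateway.
from py4j.java_gateway import JavaGateway

gateway = JavaGateway()
jvm = gateway.jvm

random = jvm.java.util.Random()
print('Random int from the JVM:', random.nextInt(100))

java_list = jvm.java.util.ArrayList()
java_list.add('hello')
java_list.add('world')
print('Size of Java list:', java_list.size())
```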
## Best Practices
### ✅ DO
- Use appropriate IPC mechanism
- Handle serialization carefully (see the sketch after these lists)
- Implement proper error handling
- Consider performance overhead
- Use type-safe interfaces
- Document integration points
### ❌ DON'T
- Pass complex objects across boundaries
- Ignore memory management
- Skip error handling
- Use blocking calls in async code
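As a concrete illustration of the serialization points above: flatten rich objects into plain, JSON-friendly data before they cross a language boundary, and validate what comes back. The `Prediction` dataclass below is purely illustrative:

```python
# Flatten a rich object into plain JSON before crossing a language boundary.
import json
from dataclasses import dataclass, asdict

@dataclass
class Prediction:
    label: str
    score: float

def to_wire(prediction: Prediction) -> str:
    # Only primitives and containers of primitives cross the boundary.
    return json.dumps(asdict(prediction))

def from_wire(payload: str) -> Prediction:
    data = json.loads(payload)
    # Validate eagerly so boundary errors surface close to their source.
    return Prediction(label=str(data['label']), score=float(data['score']))

wire = to_wire(Prediction(label='cat', score=0.93))
print(from_wire(wire))
```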
## Integration Methods
| Method | Use Case | Overhead |
|--------|----------|----------|
| **Native Bindings** | Performance-critical code | Low |
| **IPC/Pipes** | Separate processes | Medium |
| **gRPC** | Microservices | Medium |
| **HTTP/REST** | Loose coupling | High |
| **Message Queue** | Async processing | Medium |
## Resources
- [Node.js C++ Addons](https://nodejs.org/api/addons.html)
- [PyO3](https://pyo3.rs/)
- [gRPC](https://grpc.io/)
- [Py4J](https://www.py4j.org/)
## FAQ
### Which method has the lowest runtime overhead?
In-process native bindings (FFI, PyO3, Node native addons) have the lowest overhead. Subprocesses and networked RPCs add latency and serialization cost.
### How do I choose between gRPC and subprocess calls?
Use subprocess or IPC for simple, local integrations with process isolation. Use gRPC for scalable, language-agnostic microservices across machines with formal contracts (proto).