JSON to CSV
Convert JSON data to CSV format with customizable options
About JSON to CSV Conversion
JSON to CSV conversion transforms structured JSON data into comma-separated values format, making it compatible with spreadsheet applications, databases, and data analysis tools. Perfect for data export, reporting, and integration with business intelligence platforms.
- Convert JSON arrays and objects to CSV format
- Handle nested objects and flatten data structures
- Customize delimiters, headers, and formatting
- Support for large datasets and batch processing
- Preserve data types and handle special characters
Conversion Features
Data Handling
- Flatten nested JSON objects
- Handle arrays and primitive values
- Preserve data relationships
- Auto-generate column headers
- Handle missing or null values
Output Options
- Custom field delimiters (comma, semicolon, tab, pipe)
- Configurable quote characters
- Header row customization
- Date format standardization
- Character encoding options
Frequently Asked Questions
How are nested JSON objects handled?
Nested objects are flattened using dot notation (e.g., "user.address.city"). Arrays are either converted to comma-separated strings or split into multiple rows depending on your preference.
What happens with missing data?
Missing or null values are typically converted to empty CSV cells or a placeholder value like "N/A". You can customize how null values are represented in the output.
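As a sketch of the null-handling described above, the snippet below substitutes a placeholder for both explicit nulls and missing keys. The function name `rows_to_csv` and the `placeholder` parameter are illustrative, not part of any specific library:

```python
import csv
import io

def rows_to_csv(rows, fieldnames, placeholder=""):
    """Write dict rows to CSV, substituting a placeholder for missing or None values."""
    output = io.StringIO()
    # restval fills in columns that are absent from a given row
    writer = csv.DictWriter(output, fieldnames=fieldnames, restval=placeholder)
    writer.writeheader()
    for row in rows:
        # Replace explicit None values as well
        writer.writerow({k: (placeholder if v is None else v) for k, v in row.items()})
    return output.getvalue()

data = [{"name": "John", "age": None}, {"name": "Jane"}]
print(rows_to_csv(data, ["name", "age"], placeholder="N/A"))
```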
Can I customize the CSV delimiter?
Yes! You can use comma (,), semicolon (;), tab (\t), or pipe (|) as the delimiter. This is useful for different regional CSV standards or specific application requirements.
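A minimal sketch of delimiter selection using Python's standard csv module (the helper name `write_with_delimiter` is illustrative):

```python
import csv
import io

def write_with_delimiter(rows, fieldnames, delimiter=","):
    """Write dict rows using the chosen delimiter (',', ';', '\\t', or '|')."""
    output = io.StringIO()
    writer = csv.DictWriter(output, fieldnames=fieldnames, delimiter=delimiter)
    writer.writeheader()
    writer.writerows(rows)
    return output.getvalue()

rows = [{"name": "Anna", "score": "3,5"}]  # value contains a comma
# With ';' as delimiter (common in European locales), the comma needs no quoting
print(write_with_delimiter(rows, ["name", "score"], delimiter=";"))
```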
How large a JSON file can I convert?
The tool can handle moderately large JSON files in the browser. For very large datasets (>100MB), consider using server-side processing or splitting the data into smaller chunks.
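One way to avoid loading a huge dataset into memory, assuming the input can be supplied as newline-delimited JSON (JSON Lines), is to parse and write one record at a time. This is a sketch, not the tool's actual implementation:

```python
import csv
import io
import json

def jsonl_to_csv(lines, fieldnames, out):
    """Stream JSON Lines records to CSV one row at a time."""
    writer = csv.DictWriter(out, fieldnames=fieldnames, restval="")
    writer.writeheader()
    for line in lines:
        if line.strip():
            # Each line is a complete JSON object, so memory use stays constant
            writer.writerow(json.loads(line))

jsonl = '{"name": "Ada", "age": 36}\n{"name": "Bob", "age": 41}\n'
out = io.StringIO()
jsonl_to_csv(jsonl.splitlines(), ["name", "age"], out)
print(out.getvalue())
```

In practice, `lines` would be an open file handle rather than an in-memory string, so only one record is resident at a time.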
Conversion Examples
Simple JSON Array:
JSON Input:
[
{
"name": "John Doe",
"age": 30,
"city": "New York"
},
{
"name": "Jane Smith",
"age": 25,
"city": "Los Angeles"
}
]
CSV Output:
name,age,city
"John Doe",30,"New York"
"Jane Smith",25,"Los Angeles"
Nested JSON Objects:
JSON Input:
[
{
"user": {
"name": "Alice",
"contact": {
"email": "alice@example.com",
"phone": "555-0123"
}
},
"order": {
"id": "ORDER-001",
"total": 99.99
}
}
]
CSV Output:
user.name,user.contact.email,user.contact.phone,order.id,order.total
"Alice","alice@example.com","555-0123","ORDER-001",99.99
Programming Examples
JavaScript JSON to CSV:
function jsonToCsv(jsonData, options = {}) {
  const {
    delimiter = ',',
    includeHeaders = true,
    flattenObjects = true
  } = options;

  if (!Array.isArray(jsonData) || jsonData.length === 0) {
    throw new Error('Input must be a non-empty array');
  }

  // Flatten nested objects using dot notation (e.g., "user.address.city")
  const flattenObject = (obj, prefix = '') => {
    const flattened = {};
    for (const key in obj) {
      const newKey = prefix ? `${prefix}.${key}` : key;
      if (typeof obj[key] === 'object' && obj[key] !== null && !Array.isArray(obj[key])) {
        Object.assign(flattened, flattenObject(obj[key], newKey));
      } else {
        flattened[newKey] = obj[key];
      }
    }
    return flattened;
  };

  // Process data
  const processedData = flattenObjects
    ? jsonData.map(item => flattenObject(item))
    : jsonData;

  // Collect all unique headers across rows
  const headers = [...new Set(processedData.flatMap(Object.keys))];

  // Escape CSV values: quote fields containing the delimiter, quotes, or newlines
  const escapeValue = (value) => {
    if (value === null || value === undefined) return '';
    const str = String(value);
    if (str.includes(delimiter) || str.includes('"') || str.includes('\n')) {
      return '"' + str.replace(/"/g, '""') + '"';
    }
    return str;
  };

  // Build CSV row by row
  const csvRows = [];
  if (includeHeaders) {
    csvRows.push(headers.map(escapeValue).join(delimiter));
  }
  processedData.forEach(row => {
    const values = headers.map(header => escapeValue(row[header]));
    csvRows.push(values.join(delimiter));
  });

  return csvRows.join('\n');
}

// Usage
const data = [
  { name: 'John', age: 30, city: 'NYC' },
  { name: 'Jane', age: 25, city: 'LA' }
];
console.log(jsonToCsv(data));
Python JSON to CSV:
import json
import csv
import io
from typing import List, Dict, Any

def flatten_json(obj: Dict[str, Any], parent_key: str = '', sep: str = '.') -> Dict[str, Any]:
    """Flatten a nested JSON object using dot notation."""
    items = []
    for k, v in obj.items():
        new_key = f"{parent_key}{sep}{k}" if parent_key else k
        if isinstance(v, dict):
            items.extend(flatten_json(v, new_key, sep=sep).items())
        elif isinstance(v, list):
            # Convert list to comma-separated string
            items.append((new_key, ', '.join(map(str, v))))
        else:
            items.append((new_key, v))
    return dict(items)

def json_to_csv(json_data: List[Dict[str, Any]],
                delimiter: str = ',',
                include_headers: bool = True) -> str:
    """Convert JSON data to CSV format"""
    if not json_data:
        return ""

    # Flatten all objects
    flattened_data = [flatten_json(item) for item in json_data]

    # Collect all unique fieldnames across rows
    fieldnames = set()
    for item in flattened_data:
        fieldnames.update(item.keys())
    fieldnames = sorted(fieldnames)

    # Write CSV to an in-memory buffer
    output = io.StringIO()
    writer = csv.DictWriter(
        output,
        fieldnames=fieldnames,
        delimiter=delimiter,
        quoting=csv.QUOTE_NONNUMERIC
    )
    if include_headers:
        writer.writeheader()
    for row in flattened_data:
        writer.writerow(row)
    return output.getvalue()

# Usage example
data = [
    {"name": "Alice", "details": {"age": 30, "city": "Boston"}},
    {"name": "Bob", "details": {"age": 25, "city": "Seattle"}}
]
csv_result = json_to_csv(data)
print(csv_result)
Best Practices
- Validate JSON: Ensure JSON is valid before conversion
- Handle Arrays Consistently: Decide how to represent array data (flatten vs. serialize)
- Escape Special Characters: Properly escape quotes, commas, and newlines
- Choose Appropriate Delimiters: Use semicolons for European locales, tabs for TSV
- Preserve Data Types: Consider how numbers, booleans, and dates are represented
- Test with Sample Data: Verify output with your target application
- Consider Memory Usage: Use streaming for very large datasets
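The escaping rule from the best practices above (quote fields containing delimiters, quotes, or newlines; double embedded quotes) follows the convention described in RFC 4180. A standalone sketch, with `escape_csv_field` as an illustrative helper name:

```python
def escape_csv_field(value, delimiter=","):
    """Escape a single CSV field per the usual RFC 4180-style rules."""
    s = "" if value is None else str(value)
    # Fields containing the delimiter, quotes, or line breaks must be quoted,
    # and embedded quotes are doubled
    if delimiter in s or '"' in s or "\n" in s or "\r" in s:
        return '"' + s.replace('"', '""') + '"'
    return s

print(escape_csv_field('He said "hi"'))  # "He said ""hi"""
print(escape_csv_field("a,b"))           # "a,b"
print(escape_csv_field("plain"))         # plain
```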
Common Use Cases
- API data export to spreadsheets
- Database migration and imports
- Business intelligence reporting
- Data analysis and visualization prep
- CRM and ERP system integration
- Log file analysis and processing
- E-commerce data transformation
- Research data compilation
