rdflib.plugins.sparql package¶
Subpackages¶
- rdflib.plugins.sparql.results package
- Submodules
- rdflib.plugins.sparql.results.csvresults module
- rdflib.plugins.sparql.results.graph module
- rdflib.plugins.sparql.results.jsonresults module
- rdflib.plugins.sparql.results.rdfresults module
- rdflib.plugins.sparql.results.tsvresults module
- rdflib.plugins.sparql.results.txtresults module
- rdflib.plugins.sparql.results.xmlresults module
SPARQLXMLWriter
SPARQLXMLWriter.__dict__
SPARQLXMLWriter.__firstlineno__
SPARQLXMLWriter.__init__()
SPARQLXMLWriter.__module__
SPARQLXMLWriter.__static_attributes__
SPARQLXMLWriter.__weakref__
SPARQLXMLWriter.close()
SPARQLXMLWriter.write_ask()
SPARQLXMLWriter.write_binding()
SPARQLXMLWriter.write_end_result()
SPARQLXMLWriter.write_header()
SPARQLXMLWriter.write_results_header()
SPARQLXMLWriter.write_start_result()
XMLResult
XMLResultParser
XMLResultSerializer
log
parseTerm()
- Module contents
Submodules¶
rdflib.plugins.sparql.aggregates module¶
- class rdflib.plugins.sparql.aggregates.Accumulator(aggregation)[source]¶
Bases:
object
Abstract base class for the different aggregation functions.
- Parameters:
aggregation (CompValue) –
- __firstlineno__ = 33¶
- __module__ = 'rdflib.plugins.sparql.aggregates'¶
- __static_attributes__ = ('var', 'expr', 'distinct', 'seen', 'use_row', 'dont_care')¶
- __weakref__¶
list of weak references to the object
- dont_care(row)[source]¶
Skips the distinct test.
- Parameters:
row (FrozenBindings) –
- Return type:
bool
- set_value(bindings)[source]¶
Sets the final value in bindings.
- Parameters:
bindings (MutableMapping[Variable, Identifier]) –
- Return type:
None
- use_row(row)[source]¶
Tests distinctness against the set of already-seen values.
- Parameters:
row (FrozenBindings) –
- Return type:
bool
- class rdflib.plugins.sparql.aggregates.Aggregator(aggregations)[source]¶
Bases:
object
Combines different Accumulator objects.
- Parameters:
aggregations (List[CompValue]) –
- __firstlineno__ = 280¶
- __module__ = 'rdflib.plugins.sparql.aggregates'¶
- __static_attributes__ = ('bindings', 'accumulators', 'accumulator_classes')¶
- __weakref__¶
list of weak references to the object
- accumulator_classes = {'Aggregate_Avg': <class 'rdflib.plugins.sparql.aggregates.Average'>, 'Aggregate_Count': <class 'rdflib.plugins.sparql.aggregates.Counter'>, 'Aggregate_GroupConcat': <class 'rdflib.plugins.sparql.aggregates.GroupConcat'>, 'Aggregate_Max': <class 'rdflib.plugins.sparql.aggregates.Maximum'>, 'Aggregate_Min': <class 'rdflib.plugins.sparql.aggregates.Minimum'>, 'Aggregate_Sample': <class 'rdflib.plugins.sparql.aggregates.Sample'>, 'Aggregate_Sum': <class 'rdflib.plugins.sparql.aggregates.Sum'>}¶
- get_bindings()[source]¶
Calculates and sets the final values.
- Return type:
Mapping[Variable, Identifier]
- update(row)[source]¶
Updates all of this aggregator's accumulators.
- Parameters:
row (FrozenBindings) –
- Return type:
None
- class rdflib.plugins.sparql.aggregates.Average(aggregation)[source]¶
Bases:
Accumulator
- Parameters:
aggregation (CompValue) –
- __annotations__ = {}¶
- __firstlineno__ = 146¶
- __module__ = 'rdflib.plugins.sparql.aggregates'¶
- __static_attributes__ = ('expr', 'datatype', 'distinct', 'seen', 'sum', 'counter')¶
- update(row, aggregator)[source]¶
- Parameters:
row (FrozenBindings) –
aggregator (Aggregator) –
- Return type:
None
- class rdflib.plugins.sparql.aggregates.Counter(aggregation)[source]¶
Bases:
Accumulator
- Parameters:
aggregation (CompValue) –
- __annotations__ = {}¶
- __firstlineno__ = 63¶
- __module__ = 'rdflib.plugins.sparql.aggregates'¶
- __static_attributes__ = ('expr', 'distinct', 'seen', 'eval_full_row', 'eval_row', 'value')¶
- eval_full_row(row)[source]¶
- Parameters:
row (FrozenBindings) –
- Return type:
- eval_row(row)[source]¶
- Parameters:
row (FrozenBindings) –
- Return type:
- update(row, aggregator)[source]¶
- Parameters:
row (FrozenBindings) –
aggregator (Aggregator) –
- Return type:
None
- use_row(row)[source]¶
Tests distinctness against the set of already-seen values.
- Parameters:
row (FrozenBindings) –
- Return type:
bool
- class rdflib.plugins.sparql.aggregates.Extremum(aggregation)[source]¶
Bases:
Accumulator
Abstract base class for Minimum and Maximum.
- Parameters:
aggregation (CompValue) –
- __annotations__ = {}¶
- __firstlineno__ = 182¶
- __module__ = 'rdflib.plugins.sparql.aggregates'¶
- __static_attributes__ = ('var', 'expr', 'use_row', 'dont_care', 'value')¶
- set_value(bindings)[source]¶
Sets the final value in bindings.
- Parameters:
bindings (MutableMapping[Variable, Identifier]) –
- Return type:
None
- update(row, aggregator)[source]¶
- Parameters:
row (FrozenBindings) –
aggregator (Aggregator) –
- Return type:
None
- class rdflib.plugins.sparql.aggregates.GroupConcat(aggregation)[source]¶
Bases:
Accumulator
- Parameters:
aggregation (CompValue) –
- __annotations__ = {'value': 'List[Literal]'}¶
- __firstlineno__ = 247¶
- __module__ = 'rdflib.plugins.sparql.aggregates'¶
- __static_attributes__ = ('expr', 'distinct', 'seen', 'separator', 'value')¶
- update(row, aggregator)[source]¶
- Parameters:
row (FrozenBindings) –
aggregator (Aggregator) –
- Return type:
None
- class rdflib.plugins.sparql.aggregates.Maximum(aggregation)[source]¶
Bases:
Extremum
- Parameters:
aggregation (CompValue) –
- __annotations__ = {}¶
- __firstlineno__ = 220¶
- __module__ = 'rdflib.plugins.sparql.aggregates'¶
- __static_attributes__ = ()¶
- class rdflib.plugins.sparql.aggregates.Minimum(aggregation)[source]¶
Bases:
Extremum
- Parameters:
aggregation (CompValue) –
- __annotations__ = {}¶
- __firstlineno__ = 215¶
- __module__ = 'rdflib.plugins.sparql.aggregates'¶
- __static_attributes__ = ()¶
- class rdflib.plugins.sparql.aggregates.Sample(aggregation)[source]¶
Bases:
Accumulator
Takes the first eligible value.
- __annotations__ = {}¶
- __firstlineno__ = 225¶
- __module__ = 'rdflib.plugins.sparql.aggregates'¶
- __static_attributes__ = ('use_row', 'dont_care', 'expr', 'var')¶
- update(row, aggregator)[source]¶
- Parameters:
row (FrozenBindings) –
aggregator (Aggregator) –
- Return type:
None
- class rdflib.plugins.sparql.aggregates.Sum(aggregation)[source]¶
Bases:
Accumulator
- Parameters:
aggregation (CompValue) –
- __annotations__ = {}¶
- __firstlineno__ = 119¶
- __module__ = 'rdflib.plugins.sparql.aggregates'¶
- __static_attributes__ = ('expr', 'datatype', 'distinct', 'seen', 'value')¶
- update(row, aggregator)[source]¶
- Parameters:
row (FrozenBindings) –
aggregator (Aggregator) –
- Return type:
None
rdflib.plugins.sparql.algebra module¶
- rdflib.plugins.sparql.algebra.BGP(triples=None)[source]¶
- Parameters:
triples (Optional[List[Tuple[Identifier, Identifier, Identifier]]]) –
- Return type:
- exception rdflib.plugins.sparql.algebra.ExpressionNotCoveredException[source]¶
Bases:
Exception
- __firstlineno__ = 954¶
- __module__ = 'rdflib.plugins.sparql.algebra'¶
- __static_attributes__ = ()¶
- __weakref__¶
list of weak references to the object
- rdflib.plugins.sparql.algebra.Extend(p, expr, var)[source]¶
- Parameters:
p (CompValue) –
expr (Union[Identifier, Expr]) –
var (Variable) –
- Return type:
- rdflib.plugins.sparql.algebra.Graph(term, graph)[source]¶
- Parameters:
term (Identifier) –
graph (CompValue) –
- Return type:
- exception rdflib.plugins.sparql.algebra.StopTraversal(rv)[source]¶
Bases:
Exception
- Parameters:
rv (bool) –
- __firstlineno__ = 396¶
- __module__ = 'rdflib.plugins.sparql.algebra'¶
- __static_attributes__ = ('rv',)¶
- __weakref__¶
list of weak references to the object
- rdflib.plugins.sparql.algebra.analyse(n, children)[source]¶
Some things can be lazily joined. This propagates whether they can up the tree and sets lazy flags for all joins.
- Parameters:
n (Any) –
children (Any) –
- Return type:
bool
- rdflib.plugins.sparql.algebra.collectAndRemoveFilters(parts)[source]¶
FILTER expressions apply to the whole group graph pattern in which they appear.
- rdflib.plugins.sparql.algebra.reorderTriples(l_)[source]¶
Reorder triple patterns so that the ones with the most bindings are executed first.
- Parameters:
l_ (Iterable[Tuple[Identifier, Identifier, Identifier]]) –
- Return type:
List[Tuple[Identifier, Identifier, Identifier]]
- rdflib.plugins.sparql.algebra.simplify(n)[source]¶
Remove joins to empty BGPs
- Parameters:
n (Any) –
- Return type:
Optional[CompValue]
- rdflib.plugins.sparql.algebra.translateAlgebra(query_algebra)[source]¶
Translates a SPARQL 1.1 algebra tree into the corresponding query string.
- Parameters:
query_algebra (Query) – An algebra returned by translateQuery.
- Return type:
str
- Returns:
The query form generated from the SPARQL 1.1 algebra tree for SELECT queries.
- rdflib.plugins.sparql.algebra.translateExists(e)[source]¶
Translate the graph pattern used by EXISTS and NOT EXISTS http://www.w3.org/TR/sparql11-query/#sparqlCollectFilters
- rdflib.plugins.sparql.algebra.translatePName(p, prologue)[source]¶
Expand prefixed/relative URIs
- Parameters:
- Return type:
Optional[Identifier]
- rdflib.plugins.sparql.algebra.translatePath(p: URIRef) → None[source]¶
- rdflib.plugins.sparql.algebra.translatePath(p: CompValue) → Path
Translate PropertyPath expressions
- rdflib.plugins.sparql.algebra.translateQuads(quads)[source]¶
- Parameters:
quads (CompValue) –
- Return type:
Tuple[List[Tuple[Identifier, Identifier, Identifier]], DefaultDict[str, List[Tuple[Identifier, Identifier, Identifier]]]]
- rdflib.plugins.sparql.algebra.translateQuery(q, base=None, initNs=None)[source]¶
Translate a query parse-tree to a SPARQL algebra expression.
Returns a rdflib.plugins.sparql.sparql.Query object.
- Parameters:
q (ParseResults) –
base (Optional[str]) –
initNs (Optional[Mapping[str, Any]]) –
- Return type:
- rdflib.plugins.sparql.algebra.translateUpdate(q, base=None, initNs=None)[source]¶
Returns a list of SPARQL Update Algebra expressions
- rdflib.plugins.sparql.algebra.traverse(tree, visitPre=<function <lambda>>, visitPost=<function <lambda>>, complete=None)[source]¶
Traverse tree, visiting each node with the given visit functions. A visit function may raise StopTraversal to stop the traversal. If complete is not None, it is returned on complete traversal; otherwise the transformed tree is returned.
- Parameters:
visitPre (Callable[[Any], Any]) –
visitPost (Callable[[Any], Any]) –
complete (Optional[bool]) –
- Return type:
Any
- rdflib.plugins.sparql.algebra.triples(l)[source]¶
- Parameters:
l (Union[List[List[Identifier]], List[Tuple[Identifier, Identifier, Identifier]]]) –
- Return type:
List[Tuple[Identifier, Identifier, Identifier]]
rdflib.plugins.sparql.datatypes module¶
rdflib.plugins.sparql.evaluate module¶
- rdflib.plugins.sparql.evaluate.evalAggregateJoin(ctx, agg)[source]¶
- Parameters:
ctx (QueryContext) –
agg (CompValue) –
- Return type:
Generator[FrozenBindings, None, None]
- rdflib.plugins.sparql.evaluate.evalAskQuery(ctx, query)[source]¶
- Parameters:
ctx (QueryContext) –
query (CompValue) –
- Return type:
Mapping[str, Union[str, bool]]
- rdflib.plugins.sparql.evaluate.evalBGP(ctx, bgp)[source]¶
A basic graph pattern
- Parameters:
ctx (QueryContext) –
bgp (List[Tuple[Identifier, Identifier, Identifier]]) –
- Return type:
Generator[FrozenBindings, None, None]
- rdflib.plugins.sparql.evaluate.evalConstructQuery(ctx, query)[source]¶
- Parameters:
ctx (QueryContext) –
query (CompValue) –
- Return type:
Mapping[str, Union[str, Graph]]
- rdflib.plugins.sparql.evaluate.evalDescribeQuery(ctx, query)[source]¶
- Parameters:
ctx (QueryContext) –
- Return type:
Dict[str, Union[str, Graph]]
- rdflib.plugins.sparql.evaluate.evalDistinct(ctx, part)[source]¶
- Parameters:
ctx (QueryContext) –
part (CompValue) –
- Return type:
Generator[FrozenBindings, None, None]
- rdflib.plugins.sparql.evaluate.evalExtend(ctx, extend)[source]¶
- Parameters:
ctx (QueryContext) –
extend (CompValue) –
- Return type:
Generator[FrozenBindings, None, None]
- rdflib.plugins.sparql.evaluate.evalFilter(ctx, part)[source]¶
- Parameters:
ctx (QueryContext) –
part (CompValue) –
- Return type:
Generator[FrozenBindings, None, None]
- rdflib.plugins.sparql.evaluate.evalGraph(ctx, part)[source]¶
- Parameters:
ctx (QueryContext) –
part (CompValue) –
- Return type:
Generator[FrozenBindings, None, None]
- rdflib.plugins.sparql.evaluate.evalGroup(ctx, group)[source]¶
http://www.w3.org/TR/sparql11-query/#defn_algGroup
- Parameters:
ctx (QueryContext) –
group (CompValue) –
- rdflib.plugins.sparql.evaluate.evalJoin(ctx, join)[source]¶
- Parameters:
ctx (QueryContext) –
join (CompValue) –
- Return type:
Generator[FrozenDict, None, None]
- rdflib.plugins.sparql.evaluate.evalLazyJoin(ctx, join)[source]¶
A lazy join pushes the variables bound in the first part into the second part, performing the join implicitly and, with luck, evaluating far fewer triples.
- Parameters:
ctx (QueryContext) –
join (CompValue) –
- Return type:
Generator[FrozenBindings, None, None]
- rdflib.plugins.sparql.evaluate.evalLeftJoin(ctx, join)[source]¶
- Parameters:
ctx (QueryContext) –
join (CompValue) –
- Return type:
Generator[FrozenBindings, None, None]
- rdflib.plugins.sparql.evaluate.evalMinus(ctx, minus)[source]¶
- Parameters:
ctx (QueryContext) –
minus (CompValue) –
- Return type:
Generator[FrozenDict, None, None]
- rdflib.plugins.sparql.evaluate.evalMultiset(ctx, part)[source]¶
- Parameters:
ctx (QueryContext) –
part (CompValue) –
- rdflib.plugins.sparql.evaluate.evalOrderBy(ctx, part)[source]¶
- Parameters:
ctx (QueryContext) –
part (CompValue) –
- Return type:
Generator[FrozenBindings, None, None]
- rdflib.plugins.sparql.evaluate.evalPart(ctx, part)[source]¶
- Parameters:
ctx (QueryContext) –
part (CompValue) –
- Return type:
Any
- rdflib.plugins.sparql.evaluate.evalProject(ctx, project)[source]¶
- Parameters:
ctx (QueryContext) –
project (CompValue) –
- rdflib.plugins.sparql.evaluate.evalQuery(graph, query, initBindings=None, base=None)[source]¶
Caution
This method can indirectly access network endpoints; for example, query processing will attempt to access network endpoints specified in SERVICE directives.
When processing untrusted or potentially malicious queries, measures should be taken to restrict network and file access.
For information on available security measures, see the RDFLib Security Considerations documentation.
- Parameters:
graph (Graph) –
query (Query) –
initBindings (Optional[Mapping[str, Identifier]]) –
base (Optional[str]) –
- Return type:
Mapping[Any, Any]
- rdflib.plugins.sparql.evaluate.evalReduced(ctx, part)[source]¶
Apply REDUCED to the result.
REDUCED is not as strict as DISTINCT, but if the incoming rows were sorted it should produce the same result with limited extra memory and time per incoming row.
- Parameters:
ctx (QueryContext) –
part (CompValue) –
- Return type:
Generator[FrozenBindings, None, None]
- rdflib.plugins.sparql.evaluate.evalSelectQuery(ctx, query)[source]¶
- Parameters:
ctx (QueryContext) –
query (CompValue) –
- Return type:
Mapping[str, Union[str, List[Variable], Iterable[FrozenDict]]]
- rdflib.plugins.sparql.evaluate.evalServiceQuery(ctx, part)[source]¶
- Parameters:
ctx (QueryContext) –
part (CompValue) –
- rdflib.plugins.sparql.evaluate.evalSlice(ctx, slice)[source]¶
- Parameters:
ctx (QueryContext) –
slice (CompValue) –
- rdflib.plugins.sparql.evaluate.evalUnion(ctx, union)[source]¶
- Parameters:
ctx (QueryContext) –
union (CompValue) –
- Return type:
Iterable[FrozenBindings]
- rdflib.plugins.sparql.evaluate.evalValues(ctx, part)[source]¶
- Parameters:
ctx (QueryContext) –
part (CompValue) –
- Return type:
Generator[FrozenBindings, None, None]
rdflib.plugins.sparql.evalutils module¶
rdflib.plugins.sparql.operators module¶
- rdflib.plugins.sparql.operators.AdditiveExpression(e, ctx)[source]¶
- Parameters:
e (Expr) –
ctx (Union[QueryContext, FrozenBindings]) –
- Return type:
- rdflib.plugins.sparql.operators.Builtin_BNODE(expr, ctx)[source]¶
http://www.w3.org/TR/sparql11-query/#func-bnode
- Return type:
- rdflib.plugins.sparql.operators.Builtin_COALESCE(expr, ctx)[source]¶
http://www.w3.org/TR/sparql11-query/#func-coalesce
- Parameters:
expr (Expr) –
- rdflib.plugins.sparql.operators.Builtin_DATATYPE(e, ctx)[source]¶
- Parameters:
e (Expr) –
- Return type:
Optional[str]
- rdflib.plugins.sparql.operators.Builtin_EXISTS(e, ctx)[source]¶
- Parameters:
e (Expr) –
ctx (FrozenBindings) –
- Return type:
- rdflib.plugins.sparql.operators.Builtin_IF(expr, ctx)[source]¶
http://www.w3.org/TR/sparql11-query/#func-if
- Parameters:
expr (Expr) –
- rdflib.plugins.sparql.operators.Builtin_IRI(expr, ctx)[source]¶
http://www.w3.org/TR/sparql11-query/#func-iri
- Parameters:
expr (Expr) –
ctx (FrozenBindings) –
- Return type:
- rdflib.plugins.sparql.operators.Builtin_LANG(e, ctx)[source]¶
http://www.w3.org/TR/sparql11-query/#func-lang
Returns the language tag of ltrl, if it has one. It returns “” if ltrl has no language tag. Note that the RDF data model does not include literals with an empty language tag.
- rdflib.plugins.sparql.operators.Builtin_REGEX(expr, ctx)[source]¶
http://www.w3.org/TR/sparql11-query/#func-regex Invokes the XPath fn:matches function to match text against a regular expression pattern. The regular expression language is defined in XQuery 1.0 and XPath 2.0 Functions and Operators section 7.6.1 Regular Expression Syntax
- rdflib.plugins.sparql.operators.Builtin_STRUUID(expr, ctx)[source]¶
http://www.w3.org/TR/sparql11-query/#func-struuid
- Return type:
- rdflib.plugins.sparql.operators.Builtin_isBLANK(expr, ctx)[source]¶
- Parameters:
expr (Expr) –
ctx (FrozenBindings) –
- Return type:
- rdflib.plugins.sparql.operators.ConditionalAndExpression(e, ctx)[source]¶
- Parameters:
e (Expr) –
ctx (Union[QueryContext, FrozenBindings]) –
- Return type:
- rdflib.plugins.sparql.operators.ConditionalOrExpression(e, ctx)[source]¶
- Parameters:
e (Expr) –
ctx (Union[QueryContext, FrozenBindings]) –
- Return type:
- rdflib.plugins.sparql.operators.EBV(rt: Literal) → bool[source]¶
- rdflib.plugins.sparql.operators.EBV(rt: Variable | IdentifiedNode | SPARQLError | Expr) → NoReturn
- rdflib.plugins.sparql.operators.EBV(rt: Identifier | SPARQLError | Expr) → bool | NoReturn
Effective Boolean Value (EBV)
If the argument is a typed literal with a datatype of xsd:boolean, the EBV is the value of that argument.
If the argument is a plain literal or a typed literal with a datatype of xsd:string, the EBV is false if the operand value has zero length; otherwise the EBV is true.
If the argument is a numeric type or a typed literal with a datatype derived from a numeric type, the EBV is false if the operand value is NaN or is numerically equal to zero; otherwise the EBV is true.
All other arguments, including unbound arguments, produce a type error.
- Parameters:
rt (Union[Identifier, SPARQLError, Expr]) –
- Return type:
bool
- rdflib.plugins.sparql.operators.Function(e, ctx)[source]¶
Custom functions and casts
- Parameters:
e (Expr) –
ctx (FrozenBindings) –
- Return type:
- rdflib.plugins.sparql.operators.MultiplicativeExpression(e, ctx)[source]¶
- Parameters:
e (Expr) –
ctx (Union[QueryContext, FrozenBindings]) –
- Return type:
- rdflib.plugins.sparql.operators.RelationalExpression(e, ctx)[source]¶
- Parameters:
e (Expr) –
ctx (Union[QueryContext, FrozenBindings]) –
- Return type:
- rdflib.plugins.sparql.operators.UnaryMinus(expr, ctx)[source]¶
- Parameters:
expr (Expr) –
ctx (FrozenBindings) –
- Return type:
- rdflib.plugins.sparql.operators.UnaryNot(expr, ctx)[source]¶
- Parameters:
expr (Expr) –
ctx (FrozenBindings) –
- Return type:
- rdflib.plugins.sparql.operators.UnaryPlus(expr, ctx)[source]¶
- Parameters:
expr (Expr) –
ctx (FrozenBindings) –
- Return type:
- rdflib.plugins.sparql.operators.calculateDuration(obj1, obj2)[source]¶
Returns the duration Literal between two datetimes.
- Parameters:
obj1 (Union[date, datetime]) –
obj2 (Union[date, datetime]) –
- Return type:
- rdflib.plugins.sparql.operators.calculateFinalDateTime(obj1, dt1, obj2, dt2, operation)[source]¶
Calculates the final dateTime/date/time resulting from addition/subtraction of a duration/dayTimeDuration/yearMonthDuration.
- rdflib.plugins.sparql.operators.custom_function(uri, override=False, raw=False)[source]¶
Decorator version of register_custom_function().
- Parameters:
uri (URIRef) –
override (bool) –
raw (bool) –
- Return type:
Callable[[Callable[[Expr, FrozenBindings], Node]], Callable[[Expr, FrozenBindings], Node]]
- rdflib.plugins.sparql.operators.dateTimeObjects(expr)[source]¶
Returns a dateTime/date/time/duration/dayTimeDuration/yearMonthDuration Python object from a literal.
- Parameters:
expr (Literal) –
- Return type:
Any
- rdflib.plugins.sparql.operators.datetime(e)[source]¶
- Parameters:
e (Literal) –
- Return type:
datetime
- rdflib.plugins.sparql.operators.default_cast(e, ctx)[source]¶
- Parameters:
e (Expr) –
ctx (FrozenBindings) –
- Return type:
- rdflib.plugins.sparql.operators.isCompatibleDateTimeDatatype(obj1, dt1, obj2, dt2)[source]¶
Returns a boolean indicating whether the first object is compatible with the operation (+/-) over the second object.
- rdflib.plugins.sparql.operators.numeric(expr)[source]¶
Returns a number from a literal (http://www.w3.org/TR/xpath20/#promotion), or raises TypeError.
- Parameters:
expr (Literal) –
- Return type:
Any
- rdflib.plugins.sparql.operators.register_custom_function(uri, func, override=False, raw=False)[source]¶
Register a custom SPARQL function.
By default, the function will be passed the RDF terms in the argument list. If raw is True, the function will be passed an Expression and a Context.
The function must return an RDF term, or raise a SPARQLError.
- Parameters:
uri (URIRef) –
func (Callable[[Expr, FrozenBindings], Node]) –
override (bool) –
raw (bool) –
- Return type:
None
- rdflib.plugins.sparql.operators.string(s)[source]¶
Make sure the passed value is a string literal, i.e. a plain literal, xsd:string literal, or language-tagged literal.
- rdflib.plugins.sparql.operators.unregister_custom_function(uri, func=None)[source]¶
The ‘func’ argument is included for compatibility with existing code. A previous implementation checked that the function associated with the given uri was actually ‘func’, but this is not necessary as the uri should uniquely identify the function.
- Parameters:
uri (URIRef) –
func (Optional[Callable[..., Any]]) –
- Return type:
None
rdflib.plugins.sparql.parser module¶
SPARQL 1.1 Parser
based on pyparsing
- rdflib.plugins.sparql.parser.expandBNodeTriples(terms)[source]¶
expand [ ?p ?o ] syntax for implicit bnodes
- Parameters:
terms (ParseResults) –
- Return type:
List[Any]
- rdflib.plugins.sparql.parser.expandCollection(terms)[source]¶
expand ( 1 2 3 ) notation for collections
- Parameters:
terms (ParseResults) –
- Return type:
List[List[Any]]
- rdflib.plugins.sparql.parser.expandTriples(terms)[source]¶
Expand ; and , syntax for repeat predicates, subjects
- Parameters:
terms (ParseResults) –
- Return type:
List[Any]
- rdflib.plugins.sparql.parser.expandUnicodeEscapes(q)[source]¶
The syntax of the SPARQL Query Language is expressed over code points in Unicode [UNICODE]. The encoding is always UTF-8 [RFC3629]. Unicode code points may also be expressed using a \uXXXX (U+0 to U+FFFF) or \UXXXXXXXX syntax (for U+10000 onwards) where X is a hexadecimal digit [0-9A-F].
- Parameters:
q (str) –
- Return type:
str
- rdflib.plugins.sparql.parser.parseQuery(q)[source]¶
- Parameters:
q (Union[str, bytes, TextIO, BinaryIO]) –
- Return type:
ParseResults
- rdflib.plugins.sparql.parser.parseUpdate(q)[source]¶
- Parameters:
q (Union[str, bytes, TextIO, BinaryIO]) –
- Return type:
rdflib.plugins.sparql.parserutils module¶
- class rdflib.plugins.sparql.parserutils.Comp(name, expr)[source]¶
Bases:
TokenConverter
A pyparsing token for grouping things together with a label. Any sub-tokens that are not Params will be ignored.
Returns CompValue / Expr objects, depending on whether evalfn is set.
- Parameters:
name (str) –
expr (ParserElement) –
- __abstractmethods__ = frozenset({})¶
- __firstlineno__ = 235¶
- __module__ = 'rdflib.plugins.sparql.parserutils'¶
- __slotnames__ = []¶
- __static_attributes__ = ('expr', 'evalfn', 'name')¶
- class rdflib.plugins.sparql.parserutils.CompValue(name, **values)[source]¶
Bases:
OrderedDict
The result of parsing a Comp. Any included Params are available as dict keys or as attributes.
- Parameters:
name (str) –
- __firstlineno__ = 152¶
- __module__ = 'rdflib.plugins.sparql.parserutils'¶
- __static_attributes__ = ('ctx', 'name')¶
- class rdflib.plugins.sparql.parserutils.Expr(name, evalfn=None, **values)[source]¶
Bases:
CompValue
A CompValue that is evaluatable
- Parameters:
name (str) –
evalfn (Optional[Callable[[Any, Any], Any]]) –
- __annotations__ = {}¶
- __firstlineno__ = 207¶
- __init__(name, evalfn=None, **values)[source]¶
- Parameters:
name (str) –
evalfn (Optional[Callable[[Any, Any], Any]]) –
- __module__ = 'rdflib.plugins.sparql.parserutils'¶
- __static_attributes__ = ('ctx', '_evalfn')¶
- eval(ctx={})[source]¶
- Parameters:
ctx (Any) –
- Return type:
Union[SPARQLError, Any]
- class rdflib.plugins.sparql.parserutils.Param(name, expr, isList=False)[source]¶
Bases:
TokenConverter
A pyparsing token for labelling a part of the parse-tree. If isList is true, repeated occurrences of ParamList have their values merged into a list.
- Parameters:
name (str) –
isList (bool) –
- __abstractmethods__ = frozenset({})¶
- __firstlineno__ = 123¶
- __module__ = 'rdflib.plugins.sparql.parserutils'¶
- __slotnames__ = []¶
- __static_attributes__ = ('isList', 'name', 'postParse2')¶
- class rdflib.plugins.sparql.parserutils.ParamList(name, expr)[source]¶
Bases:
Param
A shortcut for a Param with isList=True
- Parameters:
name (str) –
- __abstractmethods__ = frozenset({})¶
- __firstlineno__ = 140¶
- __module__ = 'rdflib.plugins.sparql.parserutils'¶
- __static_attributes__ = ()¶
- class rdflib.plugins.sparql.parserutils.ParamValue(name, tokenList, isList)[source]¶
Bases:
object
The result of parsing a Param. This just keeps the name/value; all the cleverness is in the CompValue.
- Parameters:
name (str) –
tokenList (Union[List[Any], ParseResults]) –
isList (bool) –
- __firstlineno__ = 102¶
- __init__(name, tokenList, isList)[source]¶
- Parameters:
name (str) –
tokenList (Union[List[Any], ParseResults]) –
isList (bool) –
- __module__ = 'rdflib.plugins.sparql.parserutils'¶
- __static_attributes__ = ('isList', 'name', 'tokenList')¶
- __weakref__¶
list of weak references to the object
- rdflib.plugins.sparql.parserutils.prettify_parsetree(t, indent='', depth=0)[source]¶
- Parameters:
t (ParseResults) –
indent (str) –
depth (int) –
- Return type:
str
- rdflib.plugins.sparql.parserutils.value(ctx, val, variables=False, errors=False)[source]¶
Utility function for evaluating a value or expression.
Variables will be looked up in the context. Normally an unbound variable is an error; set variables=True to return unbound variables. Normally an error is raised; set errors=True to return the error instead.
- Parameters:
ctx (FrozenBindings) –
val (Any) –
variables (bool) –
errors (bool) –
- Return type:
Any
rdflib.plugins.sparql.processor module¶
Code for tying SPARQL Engine into RDFLib
These should be automatically registered with RDFLib
- class rdflib.plugins.sparql.processor.SPARQLProcessor(graph)[source]¶
Bases:
Processor
- __annotations__ = {}¶
- __firstlineno__ = 108¶
- __module__ = 'rdflib.plugins.sparql.processor'¶
- __static_attributes__ = ('graph',)¶
- query(strOrQuery, initBindings=None, initNs=None, base=None, DEBUG=False)[source]¶
Evaluate a query with the given initial bindings, and initial namespaces. The given base is used to resolve relative URIs in the query and will be overridden by any BASE given in the query.
Caution
This method can access indirectly requested network endpoints, for example, query processing will attempt to access network endpoints specified in
SERVICE
directives.When processing untrusted or potentially malicious queries, measures should be taken to restrict network and file access.
For information on available security measures, see the RDFLib Security Considerations documentation.
- Parameters:
  - strOrQuery (Union[str, Query]) –
  - initBindings (Optional[Mapping[str, Identifier]]) –
  - initNs (Optional[Mapping[str, Any]]) –
  - base (Optional[str]) –
  - DEBUG (bool) –
- Return type:
  Mapping[str, Any]
- class rdflib.plugins.sparql.processor.SPARQLResult(res)[source]¶
Bases:
Result
- Parameters:
  res (Mapping[str, Any]) –
- __annotations__ = {}¶
- __firstlineno__ = 67¶
- __module__ = 'rdflib.plugins.sparql.processor'¶
- __static_attributes__ = ('askAnswer', 'vars', 'graph', 'bindings')¶
- class rdflib.plugins.sparql.processor.SPARQLUpdateProcessor(graph)[source]¶
Bases:
UpdateProcessor
- __annotations__ = {}¶
- __firstlineno__ = 77¶
- __module__ = 'rdflib.plugins.sparql.processor'¶
- __static_attributes__ = ('graph',)¶
- update(strOrQuery, initBindings=None, initNs=None)[source]¶
Caution
This method can indirectly access network endpoints; for example, query processing will attempt to access network endpoints specified in SERVICE directives.
When processing untrusted or potentially malicious queries, measures should be taken to restrict network and file access.
For information on available security measures, see the RDFLib Security Considerations documentation.
- Parameters:
  - strOrQuery (Union[str, Update]) –
  - initBindings (Optional[Mapping[str, Identifier]]) –
  - initNs (Optional[Mapping[str, Any]]) –
- Return type:
  None
- rdflib.plugins.sparql.processor.prepareQuery(queryString, initNs=None, base=None)[source]¶
Parse and translate a SPARQL Query
- Parameters:
  - queryString (str) –
  - initNs (Optional[Mapping[str, Any]]) –
  - base (Optional[str]) –
- Return type:
  Query
- rdflib.plugins.sparql.processor.prepareUpdate(updateString, initNs=None, base=None)[source]¶
Parse and translate a SPARQL Update
- Parameters:
  - updateString (str) –
  - initNs (Optional[Mapping[str, Any]]) –
  - base (Optional[str]) –
- Return type:
  Update
- rdflib.plugins.sparql.processor.processUpdate(graph, updateString, initBindings=None, initNs=None, base=None)[source]¶
Process a SPARQL Update request; returns nothing on success and raises an exception on error.
- Parameters:
  - graph (Graph) –
  - updateString (str) –
  - initBindings (Optional[Mapping[str, Identifier]]) –
  - initNs (Optional[Mapping[str, Any]]) –
  - base (Optional[str]) –
- Return type:
  None
rdflib.plugins.sparql.sparql module¶
- exception rdflib.plugins.sparql.sparql.AlreadyBound[source]¶
Bases:
SPARQLError
Raised when trying to bind a variable that is already bound!
- __firstlineno__ = 47¶
- __module__ = 'rdflib.plugins.sparql.sparql'¶
- __static_attributes__ = ()¶
- class rdflib.plugins.sparql.sparql.Bindings(outer=None, d=[])[source]¶
Bases:
MutableMapping
A single level of a stack of variable-value bindings. Each dict keeps a reference to the dict below it; any failed lookup is propagated to the outer level.
In Python 3.3+ this could be a collections.ChainMap.
- Parameters:
  outer (Optional[Bindings]) –
- __abstractmethods__ = frozenset({})¶
- __dict__ = mappingproxy({'__module__': 'rdflib.plugins.sparql.sparql', '__firstlineno__': 59, '__doc__': '\n\nA single level of a stack of variable-value bindings.\nEach dict keeps a reference to the dict below it,\nany failed lookup is propegated back\n\nIn python 3.3 this could be a collections.ChainMap\n', '__init__': <function Bindings.__init__>, '__getitem__': <function Bindings.__getitem__>, '__contains__': <function Bindings.__contains__>, '__setitem__': <function Bindings.__setitem__>, '__delitem__': <function Bindings.__delitem__>, '__len__': <function Bindings.__len__>, '__iter__': <function Bindings.__iter__>, '__str__': <function Bindings.__str__>, '__repr__': <function Bindings.__repr__>, '__static_attributes__': ('_d', 'outer'), '__dict__': <attribute '__dict__' of 'Bindings' objects>, '__weakref__': <attribute '__weakref__' of 'Bindings' objects>, '__abstractmethods__': frozenset(), '_abc_impl': <_abc._abc_data object>, '__annotations__': {'_d': 'Dict[str, str]'}})¶
- __firstlineno__ = 59¶
- __module__ = 'rdflib.plugins.sparql.sparql'¶
- __static_attributes__ = ('_d', 'outer')¶
- __weakref__¶
list of weak references to the object
- class rdflib.plugins.sparql.sparql.FrozenBindings(ctx, *args, **kwargs)[source]¶
Bases:
FrozenDict
- Parameters:
  ctx (QueryContext) –
- __abstractmethods__ = frozenset({})¶
- __annotations__ = {'_d': 'Dict[Identifier, Identifier]', '_hash': 'Optional[int]'}¶
- __firstlineno__ = 179¶
- __getitem__(key)[source]¶
- Parameters:
  key (Union[Identifier, str]) –
- Return type:
- __init__(ctx, *args, **kwargs)[source]¶
- Parameters:
  ctx (QueryContext) –
- __module__ = 'rdflib.plugins.sparql.sparql'¶
- __static_attributes__ = ('ctx', '_d')¶
- property bnodes: Mapping[Identifier, BNode]¶
- Return type:
  Mapping[Identifier, BNode]
- forget(before, _except=None)[source]¶
Return a frozen dict of only the bindings made in self since before.
- Parameters:
  - before (QueryContext) –
  - _except (Optional[Container[Variable]]) –
- Return type:
- merge(other)[source]¶
- Parameters:
  other (Mapping[Identifier, Identifier]) –
- Return type:
- property now: datetime¶
- Return type:
datetime
- class rdflib.plugins.sparql.sparql.FrozenDict(*args, **kwargs)[source]¶
Bases:
Mapping
An immutable hashable dict
Taken from http://stackoverflow.com/a/2704866/81121
- Parameters:
  - args (Any) –
  - kwargs (Any) –
- __abstractmethods__ = frozenset({})¶
- __annotations__ = {'_d': 'Dict[Identifier, Identifier]', '_hash': 'Optional[int]'}¶
- __dict__ = mappingproxy({'__module__': 'rdflib.plugins.sparql.sparql', '__firstlineno__': 117, '__doc__': '\nAn immutable hashable dict\n\nTaken from http://stackoverflow.com/a/2704866/81121\n\n', '__init__': <function FrozenDict.__init__>, '__iter__': <function FrozenDict.__iter__>, '__len__': <function FrozenDict.__len__>, '__getitem__': <function FrozenDict.__getitem__>, '__hash__': <function FrozenDict.__hash__>, 'project': <function FrozenDict.project>, 'disjointDomain': <function FrozenDict.disjointDomain>, 'compatible': <function FrozenDict.compatible>, 'merge': <function FrozenDict.merge>, '__str__': <function FrozenDict.__str__>, '__repr__': <function FrozenDict.__repr__>, '__static_attributes__': ('_d', '_hash'), '__dict__': <attribute '__dict__' of 'FrozenDict' objects>, '__weakref__': <attribute '__weakref__' of 'FrozenDict' objects>, '__abstractmethods__': frozenset(), '_abc_impl': <_abc._abc_data object>, '__annotations__': {'_d': 'Dict[Identifier, Identifier]', '_hash': 'Optional[int]'}})¶
- __firstlineno__ = 117¶
- __getitem__(key)[source]¶
- Parameters:
  key (Identifier) –
- Return type:
- __module__ = 'rdflib.plugins.sparql.sparql'¶
- __static_attributes__ = ('_d', '_hash')¶
- __weakref__¶
list of weak references to the object
- compatible(other)[source]¶
- Parameters:
  other (Mapping[Identifier, Identifier]) –
- Return type:
  bool
- disjointDomain(other)[source]¶
- Parameters:
  other (Mapping[Identifier, Identifier]) –
- Return type:
  bool
- merge(other)[source]¶
- Parameters:
  other (Mapping[Identifier, Identifier]) –
- Return type:
- exception rdflib.plugins.sparql.sparql.NotBoundError(msg=None)[source]¶
Bases:
SPARQLError
- Parameters:
  msg (Optional[str]) –
- __annotations__ = {}¶
- __firstlineno__ = 42¶
- __module__ = 'rdflib.plugins.sparql.sparql'¶
- __static_attributes__ = ()¶
- class rdflib.plugins.sparql.sparql.Prologue[source]¶
Bases:
object
A class for holding prefixing bindings and base URI information
- __dict__ = mappingproxy({'__module__': 'rdflib.plugins.sparql.sparql', '__firstlineno__': 411, '__doc__': '\nA class for holding prefixing bindings and base URI information\n', '__init__': <function Prologue.__init__>, 'resolvePName': <function Prologue.resolvePName>, 'bind': <function Prologue.bind>, 'absolutize': <function Prologue.absolutize>, '__static_attributes__': ('namespace_manager', 'base'), '__dict__': <attribute '__dict__' of 'Prologue' objects>, '__weakref__': <attribute '__weakref__' of 'Prologue' objects>, '__annotations__': {'base': 'Optional[str]'}})¶
- __firstlineno__ = 411¶
- __module__ = 'rdflib.plugins.sparql.sparql'¶
- __static_attributes__ = ('namespace_manager', 'base')¶
- __weakref__¶
list of weak references to the object
- class rdflib.plugins.sparql.sparql.Query(prologue, algebra)[source]¶
Bases:
object
A parsed and translated query
- __dict__ = mappingproxy({'__module__': 'rdflib.plugins.sparql.sparql', '__firstlineno__': 453, '__doc__': '\nA parsed and translated query\n', '__init__': <function Query.__init__>, '__static_attributes__': ('prologue', 'algebra'), '__dict__': <attribute '__dict__' of 'Query' objects>, '__weakref__': <attribute '__weakref__' of 'Query' objects>, '__annotations__': {'_original_args': 'Tuple[str, Mapping[str, str], Optional[str]]'}})¶
- __firstlineno__ = 453¶
- __module__ = 'rdflib.plugins.sparql.sparql'¶
- __static_attributes__ = ('prologue', 'algebra')¶
- __weakref__¶
list of weak references to the object
- class rdflib.plugins.sparql.sparql.QueryContext(graph=None, bindings=None, initBindings=None)[source]¶
Bases:
object
Query context - passed along when evaluating the query
- Parameters:
  - graph (Optional[Graph]) –
  - bindings (Union[Bindings, FrozenBindings, List[Any], None]) –
  - initBindings (Optional[Mapping[str, Identifier]]) –
- __dict__ = mappingproxy({'__module__': 'rdflib.plugins.sparql.sparql', '__firstlineno__': 249, '__doc__': '\nQuery context - passed along when evaluating the query\n', '__init__': <function QueryContext.__init__>, 'now': <property object>, 'clone': <function QueryContext.clone>, 'dataset': <property object>, 'load': <function QueryContext.load>, '__getitem__': <function QueryContext.__getitem__>, 'get': <function QueryContext.get>, 'solution': <function QueryContext.solution>, '__setitem__': <function QueryContext.__setitem__>, 'pushGraph': <function QueryContext.pushGraph>, 'push': <function QueryContext.push>, 'clean': <function QueryContext.clean>, 'thaw': <function QueryContext.thaw>, '__static_attributes__': ('initBindings', 'prologue', 'bindings', 'dataset', 'graph', 'bnodes', '_dataset', '_now'), '__dict__': <attribute '__dict__' of 'QueryContext' objects>, '__weakref__': <attribute '__weakref__' of 'QueryContext' objects>, '__annotations__': {'graph': 'Optional[Graph]', '_dataset': 'Optional[ConjunctiveGraph]', 'prologue': 'Optional[Prologue]', '_now': 'Optional[datetime.datetime]', 'bnodes': 't.MutableMapping[Identifier, BNode]'}})¶
- __firstlineno__ = 249¶
- __init__(graph=None, bindings=None, initBindings=None)[source]¶
- Parameters:
  - graph (Optional[Graph]) –
  - bindings (Union[Bindings, FrozenBindings, List[Any], None]) –
  - initBindings (Optional[Mapping[str, Identifier]]) –
- __module__ = 'rdflib.plugins.sparql.sparql'¶
- __static_attributes__ = ('initBindings', 'prologue', 'bindings', 'dataset', 'graph', 'bnodes', '_dataset', '_now')¶
- __weakref__¶
list of weak references to the object
- clone(bindings=None)[source]¶
- Parameters:
  bindings (Union[Bindings, FrozenBindings, List[Any], None]) –
- Return type:
- property dataset: ConjunctiveGraph¶
The current dataset.
- Return type:
- load(source, default=False, **kwargs)[source]¶
Load data from the source into the query context's dataset.
- Parameters:
  - source (URIRef) – The source to load from.
  - default (bool) – If True, triples from the source will be added to the default graph; otherwise they will be loaded into a graph with the source URI as its name.
  - kwargs (Any) – Keyword arguments to pass to rdflib.graph.Graph.parse().
- Return type:
  None
- property now: datetime¶
- Return type:
datetime
- solution(vars=None)[source]¶
Return a static copy of the current variable bindings as a dict.
- Parameters:
  vars (Optional[Iterable[Variable]]) –
- Return type:
- thaw(frozenbindings)[source]¶
Create a new read/write query context from the given solution
- Parameters:
  frozenbindings (FrozenBindings) –
- Return type:
- exception rdflib.plugins.sparql.sparql.SPARQLError(msg=None)[source]¶
Bases:
Exception
- Parameters:
  msg (Optional[str]) –
- __annotations__ = {}¶
- __firstlineno__ = 37¶
- __module__ = 'rdflib.plugins.sparql.sparql'¶
- __static_attributes__ = ()¶
- __weakref__¶
list of weak references to the object
- exception rdflib.plugins.sparql.sparql.SPARQLTypeError(msg)[source]¶
Bases:
SPARQLError
- Parameters:
  msg (Optional[str]) –
- __annotations__ = {}¶
- __firstlineno__ = 54¶
- __module__ = 'rdflib.plugins.sparql.sparql'¶
- __static_attributes__ = ()¶
- class rdflib.plugins.sparql.sparql.Update(prologue, algebra)[source]¶
Bases:
object
A parsed and translated update
- __dict__ = mappingproxy({'__module__': 'rdflib.plugins.sparql.sparql', '__firstlineno__': 464, '__doc__': '\nA parsed and translated update\n', '__init__': <function Update.__init__>, '__static_attributes__': ('prologue', 'algebra'), '__dict__': <attribute '__dict__' of 'Update' objects>, '__weakref__': <attribute '__weakref__' of 'Update' objects>, '__annotations__': {'_original_args': 'Tuple[str, Mapping[str, str], Optional[str]]'}})¶
- __firstlineno__ = 464¶
- __module__ = 'rdflib.plugins.sparql.sparql'¶
- __static_attributes__ = ('prologue', 'algebra')¶
- __weakref__¶
list of weak references to the object
rdflib.plugins.sparql.update module¶
Code for carrying out Update Operations
- rdflib.plugins.sparql.update.evalAdd(ctx, u)[source]¶
add all triples from src to dst
http://www.w3.org/TR/sparql11-update/#add
- Parameters:
  - ctx (QueryContext) –
  - u (CompValue) –
- Return type:
  None
- rdflib.plugins.sparql.update.evalClear(ctx, u)[source]¶
http://www.w3.org/TR/sparql11-update/#clear
- Parameters:
  - ctx (QueryContext) –
  - u (CompValue) –
- Return type:
  None
- rdflib.plugins.sparql.update.evalCopy(ctx, u)[source]¶
Remove all triples from dst, then add all triples from src to dst.
http://www.w3.org/TR/sparql11-update/#copy
- Parameters:
  - ctx (QueryContext) –
  - u (CompValue) –
- Return type:
  None
- rdflib.plugins.sparql.update.evalCreate(ctx, u)[source]¶
http://www.w3.org/TR/sparql11-update/#create
- Parameters:
  - ctx (QueryContext) –
  - u (CompValue) –
- Return type:
  None
- rdflib.plugins.sparql.update.evalDeleteData(ctx, u)[source]¶
http://www.w3.org/TR/sparql11-update/#deleteData
- Parameters:
  - ctx (QueryContext) –
  - u (CompValue) –
- Return type:
  None
- rdflib.plugins.sparql.update.evalDeleteWhere(ctx, u)[source]¶
http://www.w3.org/TR/sparql11-update/#deleteWhere
- Parameters:
  - ctx (QueryContext) –
  - u (CompValue) –
- Return type:
  None
- rdflib.plugins.sparql.update.evalDrop(ctx, u)[source]¶
http://www.w3.org/TR/sparql11-update/#drop
- Parameters:
  - ctx (QueryContext) –
  - u (CompValue) –
- Return type:
  None
- rdflib.plugins.sparql.update.evalInsertData(ctx, u)[source]¶
http://www.w3.org/TR/sparql11-update/#insertData
- Parameters:
  - ctx (QueryContext) –
  - u (CompValue) –
- Return type:
  None
- rdflib.plugins.sparql.update.evalLoad(ctx, u)[source]¶
http://www.w3.org/TR/sparql11-update/#load
- Parameters:
  - ctx (QueryContext) –
  - u (CompValue) –
- Return type:
  None
- rdflib.plugins.sparql.update.evalModify(ctx, u)[source]¶
- Parameters:
  - ctx (QueryContext) –
  - u (CompValue) –
- Return type:
  None
- rdflib.plugins.sparql.update.evalMove(ctx, u)[source]¶
Remove all triples from dst, add all triples from src to dst, then remove all triples from src.
http://www.w3.org/TR/sparql11-update/#move
- Parameters:
  - ctx (QueryContext) –
  - u (CompValue) –
- Return type:
  None
- rdflib.plugins.sparql.update.evalUpdate(graph, update, initBindings=None)[source]¶
http://www.w3.org/TR/sparql11-update/#updateLanguage
‘A request is a sequence of operations […] Implementations MUST ensure that operations of a single request are executed in a fashion that guarantees the same effects as executing them in lexical order.
Operations all result either in success or failure.
If multiple operations are present in a single request, then a result of failure from any operation MUST abort the sequence of operations, causing the subsequent operations to be ignored.’
This will return None on success and raise Exceptions on error
Caution
This method can indirectly access network endpoints; for example, query processing will attempt to access network endpoints specified in SERVICE directives.
When processing untrusted or potentially malicious queries, measures should be taken to restrict network and file access.
For information on available security measures, see the RDFLib Security Considerations documentation.
- Parameters:
  - graph (Graph) –
  - update (Update) –
  - initBindings (Optional[Mapping[str, Identifier]]) –
- Return type:
  None
Module contents¶
SPARQL implementation for RDFLib
New in version 4.0.
- rdflib.plugins.sparql.CUSTOM_EVALS = {}¶
Custom evaluation functions
These must be functions taking (ctx, part) that raise NotImplementedError if they cannot handle a given part.
- rdflib.plugins.sparql.prepareQuery(queryString, initNs=None, base=None)[source]¶
Parse and translate a SPARQL Query
- Parameters:
  - queryString (str) –
  - initNs (Optional[Mapping[str, Any]]) –
  - base (Optional[str]) –
- Return type:
  Query
- rdflib.plugins.sparql.prepareUpdate(updateString, initNs=None, base=None)[source]¶
Parse and translate a SPARQL Update
- Parameters:
  - updateString (str) –
  - initNs (Optional[Mapping[str, Any]]) –
  - base (Optional[str]) –
- Return type:
  Update
- rdflib.plugins.sparql.processUpdate(graph, updateString, initBindings=None, initNs=None, base=None)[source]¶
Process a SPARQL Update request; returns nothing on success and raises an exception on error.
- Parameters:
  - graph (Graph) –
  - updateString (str) –
  - initBindings (Optional[Mapping[str, Identifier]]) –
  - initNs (Optional[Mapping[str, Any]]) –
  - base (Optional[str]) –
- Return type:
  None