Introduction:
While I think ChatGPT is an impressive breakthrough for software development, I recently experienced an incident of GPT “hallucination” during code generation (using the free version, aka GPT-3.5) that concerns me and demonstrates serious ramifications for undisciplined use of this technology. So much so, in fact, that I now consider it a danger to use any ChatGPT-generated code in mission-critical systems. Fortunately, I also have a simple recommendation on how to fix the problem. Beyond this specific incident, there is the larger question of whether this tool qualifies as a Lazy programming tactic that should be considered “Bad Practice”. For details on the distinction between “Good Lazy” and “Bad Lazy” programming, see my free book, “Lazy Programmers: The Good, The Bad and The Ugly”, available on my website.
Let’s begin by understanding the context of the programming situation, and then we will walk step-by-step through the lengthy conversation between myself and ChatGPT to generate some seemingly simple code. Frankly, the results were shocking, as I was unaware how badly ChatGPT will hallucinate even within the constrained, necessarily precise domain of software engineering. If ChatGPT does not gain the needed discipline in this area, its utility as a programming “copilot” will be relegated to only those organizations willing to “move fast and break things” (and I don’t say that as a compliment).
My coding objective was to extract metadata that the NVIDIA DeepStream Software Development Kit had inserted into an RTSP video stream. It is important to understand that NVIDIA’s DeepStream SDK is a C/C++ code base that leverages the open-source GStreamer library to perform video analytics and machine learning. On the receiving end, we have a Java program that needs to take the video feed from the Jetson device and operate on it in various ways. So, not being very familiar with either the NVIDIA SDK or GStreamer, I was hoping that ChatGPT could assist me in writing some code to extract the DeepStream-embedded metadata. Unfortunately, as you will see, ChatGPT was unable to assist me in that task, instead leading me down a progressively more frustrating rabbit hole!
Let’s now dive into the specifics by following the trail of prompts and responses and analyzing each ChatGPT response for what it does well and what it does poorly. Most importantly, I will clearly highlight when ChatGPT is hallucinating and the ramifications of that for our industry. Note: there are 13 prompts and responses, so if you are not interested in the specifics of the code (which are also presented here), just move past them to the highlighted prompts, responses and analyses (all of which are in bold and numbered accordingly).
User Prompt #1: Use the gstreamer-java library to generate a java program that extracts metadata from Gstreamer buffers created by the NVIDIA DeepStream library.
Chat GPT Response #1: To extract metadata from Gstreamer buffers created by the NVIDIA DeepStream library using the gstreamer-java library, you can follow the steps below:
1. Install gstreamer-java library:
You can download and install the gstreamer-java library from its official website.
2. Create a Java project:
Create a new Java project in your preferred development environment.
3. Add the gstreamer-java library to the project:
Add the gstreamer-java library to the Java project. You can do this by adding the gstreamer-java.jar file to the classpath.
4. Import the necessary classes:
Import the necessary classes from the gstreamer-java library. This includes the Gst and GstBuffer classes.
5. Initialize the GStreamer library:
Before using any GStreamer functionality, you must initialize the GStreamer library. You can do this using the following code:
Gst.init();
6. Create a pipeline:
7. Create a GStreamer pipeline that reads from the NVIDIA DeepStream buffer using the following code:
Pipeline pipeline = new Pipeline();
Element nvstreammux = ElementFactory.make("nvstreammux", "stream-muxer");
// Add other elements to the pipeline as required
pipeline.add(nvstreammux);
8. Set up a bus:
Set up a bus that allows you to receive messages from the pipeline using the following code:
Bus bus = pipeline.getBus();
bus.connect(new Bus.EOS() {
public void endOfStream(GstObject source) {
System.out.println("End of Stream");
}
});
bus.connect(new Bus.ERROR() {
public void errorMessage(GstObject source, int code, String message) {
System.out.println("Error: " + message);
}
});
9. Start the pipeline:
Start the pipeline using the following code:
pipeline.play();
10. Extract metadata:
Extract metadata from the GStreamer buffer using the following code:
nvstreammux.connect(new Element.PAD_ADDED() {
public void padAdded(Element element, Pad pad) {
pad.addProbe(new PadProbe() {
public PadProbeReturn probeData(Pad pad, Buffer buffer) {
// Extract metadata from the buffer
return PadProbeReturn.OK;
}
});
}
});
In the above code, you add a pad probe to the nvstreammux element's pads, which allows you to extract metadata from the buffer.
11. Stop the pipeline:
Stop the pipeline using the following code:
pipeline.stop();
This is a basic example of how to extract metadata from GStreamer buffers created by the NVIDIA DeepStream library using the gstreamer-java library. You will need to modify the code to suit your specific requirements.
Analysis of the ChatGPT Response #1:
While the GPT response gives me a reasonable list of steps to achieve my goal, it is not actually what I asked for. I asked it to generate a Java program, and a list of disjointed steps is not a program. So, my next prompt was simply to correct that and get a (hopefully) working example. Getting concrete help is the promise of generative AI. If it is done correctly and in a trustworthy manner, it will change our society drastically. That is its promise, but, as you will see, we are a long way from fulfilling it.
User Prompt #2: Using the above process create a complete example given an RTSP stream.
Chat GPT Response #2: Sure, here is an example Java program that uses the gstreamer-java library to extract metadata from an RTSP stream using the NVIDIA DeepStream library:
import org.freedesktop.gstreamer.*;
import org.freedesktop.gstreamer.elements.AppSink;
public class DeepStreamMetadataExtractor {
public static void main(String[] args) {
// Initialize GStreamer library
Gst.init();
// Create pipeline
Pipeline pipeline = new Pipeline();
// Create elements
Element rtspSrc = ElementFactory.make("rtspsrc", "rtsp-source");
Element decodeBin = ElementFactory.make("decodebin", "decoder");
Element nvStreamMux = ElementFactory.make("nvstreammux", "stream-muxer");
Element nvInfer = ElementFactory.make("nvinfer", "inferencer");
Element nvMetadata = ElementFactory.make("nvmetadata", "metadata-converter");
AppSink appSink = new AppSink("GstVideoAppSink");
// Set properties
rtspSrc.set("location", "rtsp://your-stream-url");
nvStreamMux.set("width", 1920);
nvStreamMux.set("height", 1080);
nvStreamMux.set("batch-size", 1);
// Add elements to pipeline
pipeline.add(rtspSrc);
pipeline.add(decodeBin);
pipeline.add(nvStreamMux);
pipeline.add(nvInfer);
pipeline.add(nvMetadata);
pipeline.add(appSink);
// Link elements
rtspSrc.link(decodeBin);
decodeBin.link(nvStreamMux);
nvStreamMux.link(nvInfer);
nvInfer.link(nvMetadata);
nvMetadata.link(appSink);
// Set up bus
Bus bus = pipeline.getBus();
bus.connect(new Bus.EOS() {
public void endOfStream(GstObject source) {
System.out.println("End of Stream");
}
});
bus.connect(new Bus.ERROR() {
public void errorMessage(GstObject source, int code, String message) {
System.out.println("Error: " + message);
}
});
// Set up app sink
appSink.set("emit-signals", true);
appSink.set("sync", false);
appSink.connect(new AppSink.NEW_SAMPLE() {
public void newSample(AppSink elem) {
Sample sample = elem.pullSample();
Buffer buffer = sample.getBuffer();
// Extract metadata from buffer
byte[] metadata = GstNvMetadata.get(buffer);
if (metadata != null) {
String metadataStr = new String(metadata);
System.out.println("Metadata: " + metadataStr);
}
sample.dispose();
}
});
// Start pipeline
pipeline.play();
// Wait for pipeline to finish
Gst.main();
}
}
In this example, we create a pipeline that reads an RTSP stream and passes the video frames through the NVIDIA DeepStream library for inferencing and metadata extraction. The metadata is then printed to the console.
You will need to replace "rtsp://your-stream-url" with the actual RTSP stream URL. You may also need to modify the pipeline and element configurations to suit your specific requirements.
To run the program, compile the Java code and run the resulting class file:
javac -cp gstreamer-java.jar DeepStreamMetadataExtractor.java
java -cp .:gstreamer-java.jar DeepStreamMetadataExtractor
Analysis of the ChatGPT Response #2:
While the ChatGPT response looks like a complete program, it would actually neither compile nor run, because there is no GstNvMetadata class in the gstreamer-java bindings to the GStreamer C libraries. In fact, there is no GstNvMetadata class in the GStreamer C libraries either. Specifically, here is what GitHub will tell you if you search for that string: “We couldn’t find any code matching 'GstNvMetadata' in gstreamer-java/gst1-java-core”. So, ChatGPT invented a class that does not exist! While such invention (also known as hallucination) is annoying but possibly harmless in languages like English, it amounts to gross negligence in programming. You would never let even a junior programmer suggest code that does not compile. A basic existence proof is absolutely required when generating source code. Without that, generative AI for programming is futile and possibly dangerous. But that is not the only problem with the generated code. There is an even subtler issue lurking here.
Notice the ChatGPT response says that this code is “using the NVIDIA DeepStream library”. What? Go back and examine the initial prompt… The original prompt stated that the GStreamer buffers were created by that library (DeepStream), but the program to be generated is not the producer of the stream but the consumer of the stream! THIS COULD BE A FATAL FLAW IN generative AI, because it does not understand concepts; in fact, it actually understands Nothing … just the probability of words … and in programming, you are NOT just generating words (even highly probable words). Programming MUST be deterministic. Do you think this is a fatal flaw in the current crop of generative AI tools? Respond on Twitter or LinkedIn with your thoughts. Ok, let’s continue, as I then asked ChatGPT to correct its mistake.
User Prompt #3: The class GstNvMetadata is not found. Where can I get that dependency?
Chat GPT Response #3: I apologize for the confusion. The GstNvMetadata class is not part of the gstreamer-java library. It is part of the NVIDIA DeepStream SDK.
To use the GstNvMetadata class, you need to download and install the NVIDIA DeepStream SDK. You can download the SDK from the NVIDIA website: https://developer.nvidia.com/deepstream-sdk.
Once you have installed the DeepStream SDK, you will need to add the necessary libraries to your Java classpath. The path to the libraries may vary depending on your installation, but they are typically located in the lib directory of the DeepStream SDK.
You will need to add the following libraries to your classpath:
libgstdsmsdk.so
libnvds_meta.so
libnvds_utils.so
You can add the libraries to your classpath using the -Djava.library.path command line option when running your Java program. For example:
java -cp .:gstreamer-java.jar -Djava.library.path=/path/to/deepstream/lib DeepStreamMetadataExtractor
Replace /path/to/deepstream/lib with the path to the lib directory of your DeepStream SDK installation.
Once you have added the necessary libraries to your classpath, the GstNvMetadata class should be available for use in your program.
Analysis of the ChatGPT Response #3:
Sadly, this response borders on the ridiculous. It states that its made-up Java class (GstNvMetadata) is part of a C-based SDK. It tells you to add “.so” files, which are C libraries and not JAR files, to your library path, as if you had a Java Native Interface (JNI) integration to the hallucinated Java class. Also, remember that we are writing the consumer of the video feed and NOT the producer of the video feed. So, unfortunately, ChatGPT has gone off the rails here. I then foolishly tried to ask ChatGPT how its reasoning works here, giving it the benefit of the doubt (in hindsight I should not have wasted my time on the next prompt, but at the time, I did not know how badly it would hallucinate in relation to source code).
User Prompt #4: But those libraries are C libraries compiled to .so files ... how does that enable a java class to be exposed within the gstreamer-java library?
ChatGPT Response #4: You are correct that the libraries I mentioned are C libraries compiled to .so files. They are not directly used by the gstreamer-java library. However, the GstNvMetadata class is a wrapper around these C libraries that allows Java programs to extract metadata from buffers processed by the NVIDIA DeepStream SDK.
The GstNvMetadata class provides a set of static methods that interact with the C libraries through JNI (Java Native Interface). These methods allow Java programs to extract metadata from buffers created by the NVIDIA DeepStream SDK, without needing to directly interact with the C libraries themselves.
When you run a Java program that uses the GstNvMetadata class, you need to ensure that the C libraries are available on the system where the program is running. The Djava.library.path command line option allows you to specify the path to the directory where the C libraries are located, so that the JVM can find and load them at runtime.
So to summarize, the GstNvMetadata class is not part of the gstreamer-java library, but rather a wrapper around C libraries provided by the NVIDIA DeepStream SDK. The Djava.library.path command line option allows the JVM to find and load these C libraries at runtime, making it possible to use the GstNvMetadata class to extract metadata from buffers processed by the DeepStream SDK in a Java program.
Analysis of the ChatGPT Response #4:
So, what we have here is an articulate defense of its previously hallucinated answer. Unfortunately, it does not know that its base premise (the hallucinated GstNvMetadata class) is false. Bottom line: the GstNvMetadata class DOES NOT EXIST. OpenAI, Microsoft, Google, and every other company using generative AI for source-code generation should have a final validation stage with, at a minimum, a basic existence proof for all classes, methods, and libraries it suggests. This is non-negotiable and a huge liability risk for any company that does not heed that warning! I cannot stress this enough, because if these companies do NOT do this, they are opening themselves up to significant liability and ruining the potential of this technology before it has time to mature.
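That minimal existence proof is cheap to implement. Here is a hypothetical sketch of such a check in plain Java (the class name ExistenceCheck is my own invention): it simply asks the classloader whether a fully qualified class name resolves, using the com.nvidia.ds.gst.GstNvMetadata import path that ChatGPT itself supplied as the failing case.

```java
public class ExistenceCheck {

    /** Returns true if the named class resolves on the current classpath. */
    public static boolean classExists(String fqcn) {
        try {
            // false = do not run static initializers; we only need resolution
            Class.forName(fqcn, false, ExistenceCheck.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // A real JDK class resolves...
        System.out.println(classExists("java.nio.ByteBuffer"));             // true
        // ...while the class ChatGPT invented does not.
        System.out.println(classExists("com.nvidia.ds.gst.GstNvMetadata")); // false
    }
}
```

A code-generation pipeline could run a check like this over every import and fully qualified name in its output, against the user’s declared classpath, and flag or regenerate anything that fails to resolve.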
User Prompt #5: Ok, understand that GstNvMetadata provides a set of static methods so does that mean that is a java class?
ChatGPT Response #5: Yes, GstNvMetadata is a Java class that provides a set of static methods for working with metadata generated by the NVIDIA DeepStream SDK.
The GstNvMetadata class is part of the nvds- metadata library that is included with the DeepStream SDK. This library provides a set of C functions for working with metadata, and the GstNvMetadata class provides a Java API for using these functions through JNI.
The GstNvMetadata class is implemented in Java, and it contains a set of static methods that wrap the C functions provided by the nvds-metadata library. These methods take in buffers containing metadata generated by the DeepStream SDK, and they extract information such as object bounding boxes, class labels, and confidence scores.
You can use the GstNvMetadata class in your Java program by importing it and calling its static methods. The class is typically included in the nvds-jar JAR file that is included with the DeepStream SDK, so you will need to make sure this JAR file is on your classpath when running your Java program.
For example, here is a sample Java code snippet that uses the GstNvMetadata class to extract object bounding boxes from a buffer:
import org.freedesktop.gstreamer.Buffer;
import org.freedesktop.gstreamer.Structure;
import com.nvidia.ds.gst.GstNvMetadata;
public class MetadataExtractor {
public static void main(String[] args) {
// Create a buffer containing metadata from DeepStream SDK
Buffer buffer = createBuffer();
// Extract object bounding boxes from metadata buffer
Structure[] bboxStructures = GstNvMetadata.getBoundingBoxMeta(buffer);
for (Structure bboxStructure : bboxStructures) {
int left = bboxStructure.getInteger("left");
int top = bboxStructure.getInteger("top");
int width = bboxStructure.getInteger("width");
int height = bboxStructure.getInteger("height");
System.out.println("Object bounding box: left=" + left + ", top=" + top + ", width=" + width + ", height=" + height);
}
}
private static Buffer createBuffer() {
// TODO: Implement code to create a buffer containing metadata from DeepStream SDK
return null;
}
}
Analysis of the ChatGPT Response #5:
Ok, things just went from bad to worse, and ChatGPT just continues to answer with 100% confidence (or seemingly so). In fact, a “confidence” factor would be great to see on every generated response. So, how did this go from bad to worse? Not only did it hallucinate a non-existent Java class, but it then hallucinated a non-existent JAR file (aka Java library) that would host that non-existent Java class. So we have a hallucination within a hallucination! Bravo, a “Dream within a Dream” (with apologies to Edgar Allan Poe)! So, having hit my frustration limit, I switched gears and told it to take another tack.
User Prompt #6: I cannot find the GstNvMetadata class. Can you regenerate the java program using only gstreamer-java classes?
ChatGPT Response #6: Sure, here is an updated Java program that extracts metadata from an RTSP stream using only gstreamer-java classes:
import org.freedesktop.gstreamer.*;
import org.freedesktop.gstreamer.elements.*;
import org.freedesktop.gstreamer.plugins.*;
import org.freedesktop.gstreamer.utils.*;
import org.freedesktop.gstreamer.video.VideoOverlay;
public class MetadataExtractor {
public static void main(String[] args) {
// Initialize GStreamer
Gst.init("MetadataExtractor", args);
// Create a pipeline to read from an RTSP stream
Pipeline pipeline = new Pipeline();
Element source = ElementFactory.make("rtspsrc", "source");
source.set("location", "rtsp://your-stream-url");
Element decodebin = ElementFactory.make("decodebin", "decodebin");
Element queue = ElementFactory.make("queue", "queue");
Element videoconvert = ElementFactory.make("videoconvert", "videoconvert");
Element videoscale = ElementFactory.make("videoscale", "videoscale");
Element filter = ElementFactory.make("capsfilter", "filter");
filter.setCaps(Caps.fromString("video/x-raw,width=640,height=480"));
Element appsink = ElementFactory.make("appsink", "appsink");
// Add elements to the pipeline
pipeline.addMany(source, decodebin, queue, videoconvert, videoscale, filter, appsink);
Element.linkMany(source, decodebin);
Element.linkMany(queue, videoconvert, videoscale, filter, appsink);
// Extract metadata from each buffer
appsink.connect(new AppSink.NEW_SAMPLE() {
@Override
public void newSample(AppSink elem) {
Sample sample = elem.pullSample();
Buffer buffer = sample.getBuffer();
// Extract metadata
Structure s = buffer.getMetadata("GstVideoRegionOfInterestMeta");
if (s != null) {
int x = s.getInteger("x");
int y = s.getInteger("y");
int w = s.getInteger("w");
int h = s.getInteger("h");
System.out.println("ROI: x=" + x + ", y=" + y + ", w=" + w + ", h=" + h);
}
// Don't forget to unref the buffer and sample!
buffer.unref();
sample.unref();
}
});
// Start the pipeline
pipeline.play();
// Wait for the pipeline to finish
pipeline.getBus().connect(new Bus.EOS() {
@Override
public void endOfStream(GstObject source) {
System.out.println("End of stream");
source.dispose();
Gst.quit();
}
});
pipeline.getBus().connect(new Bus.ERROR() {
@Override
public void errorMessage(GstObject source, int code, String message) {
System.err.println("Error: " + message);
source.dispose();
Gst.quit();
}
});
Gst.main();
}
}
This program uses the GstVideoRegionOfInterestMeta metadata structure to extract the location of each region of interest in the video stream. The metadata is extracted using the Buffer.getMetadata() method, which returns a Structure object containing the metadata.
Note that you may need to modify the pipeline elements and caps filter to match the specific properties of your RTSP stream. In addition, you can use other metadata structures to extract different types of metadata from the stream.
Analysis of the ChatGPT Response #6:
Ok, now this code is more promising, because it no longer uses a made-up DeepStream Java class or library. Progress! Or so I thought. Unfortunately, it generated a program that only looks for one specific type of metadata – “GstVideoRegionOfInterestMeta” – but I needed to see all the available metadata present in the stream. Again, look back at the initial prompt: did I say “extract the GstVideoRegionOfInterestMeta metadata”? No, of course not. So, back to some more “prompt engineering” to better explain to this neural network what I am after.
User Prompt #7: How do I know what metadata structures are present in a Buffer? Can you add code to extract all the available ones?
ChatGPT Response #7: Yes, you can use the Buffer.listMetadata() method to list all the available metadata structures in a buffer. Here's an updated version of the Java program that prints out all the available metadata structures for each buffer:
import org.freedesktop.gstreamer.*;
import org.freedesktop.gstreamer.elements.*;
import org.freedesktop.gstreamer.plugins.*;
import org.freedesktop.gstreamer.utils.*;
import org.freedesktop.gstreamer.video.VideoOverlay;
public class MetadataExtractor {
public static void main(String[] args) {
// Initialize GStreamer
Gst.init("MetadataExtractor", args);
// Create a pipeline to read from an RTSP stream
Pipeline pipeline = new Pipeline();
Element source = ElementFactory.make("rtspsrc", "source");
source.set("location", "rtsp://your-stream-url");
Element decodebin = ElementFactory.make("decodebin", "decodebin");
Element queue = ElementFactory.make("queue", "queue");
Element videoconvert = ElementFactory.make("videoconvert", "videoconvert");
Element videoscale = ElementFactory.make("videoscale", "videoscale");
Element filter = ElementFactory.make("capsfilter", "filter");
filter.setCaps(Caps.fromString("video/x-raw,width=640,height=480"));
Element appsink = ElementFactory.make("appsink", "appsink");
// Add elements to the pipeline
pipeline.addMany(source, decodebin, queue, videoconvert, videoscale, filter, appsink);
Element.linkMany(source, decodebin);
Element.linkMany(queue, videoconvert, videoscale, filter, appsink);
// Extract metadata from each buffer
appsink.connect(new AppSink.NEW_SAMPLE() {
@Override
public void newSample(AppSink elem) {
Sample sample = elem.pullSample();
Buffer buffer = sample.getBuffer();
// List all available metadata structures
System.out.println("Available metadata structures:");
for (String meta : buffer.listMetadata()) {
System.out.println(meta);
}
// Extract metadata
Structure s = buffer.getMetadata("GstVideoRegionOfInterestMeta");
if (s != null) {
int x = s.getInteger("x");
int y = s.getInteger("y");
int w = s.getInteger("w");
int h = s.getInteger("h");
System.out.println("ROI: x=" + x + ", y=" + y + ", w=" + w + ", h=" + h);
}
// Don't forget to unref the buffer and sample!
buffer.unref();
sample.unref();
}
});
// Start the pipeline
pipeline.play();
// Wait for the pipeline to finish
pipeline.getBus().connect(new Bus.EOS() {
@Override
public void endOfStream(GstObject source) {
System.out.println("End of stream");
source.dispose();
Gst.quit();
}
});
pipeline.getBus().connect(new Bus.ERROR() {
@Override
public void errorMessage(GstObject source, int code, String message) {
System.err.println("Error: " + message);
source.dispose();
Gst.quit();
}
});
Gst.main();
}
}
This program lists all the available metadata structures using the Buffer.listMetadata() method, which returns an array of strings containing the names of the available metadata structures. Note that this will include both GStreamer-provided metadata structures and any custom metadata structures that have been added to the buffer. The program then uses the Buffer.getMetadata() method to extract the GstVideoRegionOfInterestMeta metadata structure and print the metadata.
Analysis of the ChatGPT Response #7:
Hallelujah! This code looks extremely promising, and the Buffer.listMetadata() function is exactly what I am looking for. Unfortunately, looking at the Buffer class in the gstreamer-java repo left me crestfallen. Why? The listMetadata() method that was the solution to all my problems – DID NOT EXIST! WTF! Ughh. Of course, stupid me, I first assumed it must be my mistake. So, I asked the all-powerful Oz whether that was the case.
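As with the non-existent class earlier, a one-line reflection check would have exposed the invented method instantly. A hypothetical sketch (the class name MethodCheck is mine, and java.nio.ByteBuffer is used only as a stand-in target, since gst1-java-core is not assumed to be on the classpath here):

```java
import java.util.Arrays;

public class MethodCheck {

    /** Returns true if the class exposes a public method with the given name. */
    public static boolean methodExists(Class<?> cls, String name) {
        return Arrays.stream(cls.getMethods())
                     .anyMatch(m -> m.getName().equals(name));
    }

    public static void main(String[] args) {
        Class<?> target = java.nio.ByteBuffer.class;
        System.out.println(methodExists(target, "hasRemaining"));  // a real method: true
        System.out.println(methodExists(target, "listMetadata"));  // the invented one: false
    }
}
```

Pointed at org.freedesktop.gstreamer.Buffer with the real gstreamer-java JAR on the classpath, the same check would have reported immediately that listMetadata() does not exist.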
User Prompt #8: What is the package of the Buffer class shown above?
ChatGPT Response #8: The Buffer class used in the example above is part of the org.freedesktop.gstreamer package. This package contains the core GStreamer Java bindings, including classes for working with buffers, elements, pipelines, and other GStreamer constructs.
User Prompt (follow-up): What version number of the package has the listMetadata() method?
ChatGPT Response: The listMetadata() method is part of the org.freedesktop.gstreamer.Buffer class, which has been part of the GStreamer Java bindings since version 1.14.0. This means that any version of the org.freedesktop.gstreamer package that includes GStreamer Java bindings version 1.14.0 or later should have the listMetadata() method available.
User Prompt #9: The Buffer class in the gstreamer-java repo on GitHub does not have a listMetadata() method. It only has hasMeta() and getMeta() methods. Regenerate the above code to use those methods.
ChatGPT Response #9: You're correct, I apologize for the confusion earlier. The Buffer class in the current version of the GStreamer Java bindings on GitHub does not have a listMetadata() method, but it does have hasMeta() and getMeta() methods that can be used to access metadata on a Buffer object. Here's an updated version of the code that uses these methods to extract metadata from a buffer:
import org.freedesktop.gstreamer.Buffer;
import org.freedesktop.gstreamer.ElementFactory;
import org.freedesktop.gstreamer.Pipeline;
import org.freedesktop.gstreamer.elements.AppSink;
import org.freedesktop.gstreamer.samples.CustomData;
public class MetadataExample {
public static void main(String[] args) {
// Initialize GStreamer
CustomData data = new CustomData();
args = Gst.init("MetadataExample", args);
// Create pipeline
Pipeline pipeline = new Pipeline("rtsp-pipeline");
pipeline.add(ElementFactory.make("rtspsrc", "source"));
pipeline.add(ElementFactory.make("decodebin", "decoder"));
pipeline.add(ElementFactory.make("nvstreammetadata", "metadata"));
pipeline.add(new AppSink("appsink"));
// Configure appsink
AppSink appSink = (AppSink) pipeline.getElementByName("appsink");
appSink.set("emit-signals", true);
appSink.set("sync", false);
appSink.connect(new AppSink.NEW_SAMPLE() {
@Override
public void newSample(AppSink appSink) {
// Get the buffer from the appsink
Buffer buffer = appSink.pullSample().getBuffer();
// Check if metadata is present on the buffer
if (!buffer.hasMeta("GstVideoMeta")) {
System.out.println("No video metadata found");
return;
}
// Print out the metadata on the buffer
System.out.println("Metadata on buffer:");
for (String meta : buffer.getMetaList()) {
System.out.println(meta + ": " + buffer.getMeta(meta));
}
}
});
// Start pipeline
pipeline.play();
// Wait until error or EOS
pipeline.getBus().connect((Gst.MESSAGE_EOS) gstMessage -> data.loop.quit());
pipeline.getBus().connect((Gst.MESSAGE_ERROR) (gstMessage) -> {
Gst.quit();
return CustomData.VIDEO_ERROR;
});
// Run event loop
data.loop.run();
// Free resources
pipeline.stop();
pipeline.getState();
pipeline.dispose();
Gst.deinit();
}
}
This version of the code uses hasMeta() to check if the buffer contains a GstVideoMeta metadata structure, and getMetaList() and getMeta() to print out all available metadata keys and their values.
Analysis of the ChatGPT Response #9:
Notice that ChatGPT politely says, “You're correct, I apologize for the confusion earlier”… Do you think that ChatGPT is being sincere here? Of course not… remember, it’s all just “words” to ChatGPT. It tells you what it thinks you want to hear, just like your favorite politician! If you think I am starting to get “snarky” now, then you are correct.
The next part of the response is truly scary, because it now says, “The Buffer class in the current version of the GStreamer Java bindings on GitHub does not have a listMetadata() method.” What? Well, then why did you suggest it in the first place? Oh, right, you’re not actually suggesting anything … just spitting out words like a slick-willie babble machine! Sounds good but not necessarily accurate. Sort of like a high-paid lawyer with questionable ethics! LOL… Ok, let’s continue, since ChatGPT actually makes an attempt to use the methods I suggest. Unfortunately, it is a half-hearted, nonsensical attempt, as we will see.
User Prompt #10: No, this code above will not compile. Here is the signature of the getMeta() method: "public <T extends Meta> T getMeta(Meta.API<T> api)". Please use the correct method signatures and the Meta.API class.
ChatGPT Response #10: You're correct, I apologize for the mistake earlier. Here is an updated version of the code that uses the correct method signatures and the Meta.API class to access metadata on the buffer:
import org.freedesktop.gstreamer.Buffer;
import org.freedesktop.gstreamer.ElementFactory;
import org.freedesktop.gstreamer.Pipeline;
import org.freedesktop.gstreamer.elements.AppSink;
import org.freedesktop.gstreamer.samples.CustomData;
import org.freedesktop.gstreamer.meta.GstVideoMeta;
public class MetadataExample {
public static void main(String[] args) {
// Initialize GStreamer
CustomData data = new CustomData();
args = Gst.init("MetadataExample", args);
// Create pipeline
Pipeline pipeline = new Pipeline("rtsp-pipeline");
pipeline.add(ElementFactory.make("rtspsrc", "source"));
pipeline.add(ElementFactory.make("decodebin", "decoder"));
pipeline.add(ElementFactory.make("nvstreammetadata", "metadata"));
pipeline.add(new AppSink("appsink"));
// Configure appsink
AppSink appSink = (AppSink) pipeline.getElementByName("appsink");
appSink.set("emit-signals", true);
appSink.set("sync", false);
appSink.connect(new AppSink.NEW_SAMPLE() {
@Override
public void newSample(AppSink appSink) {
// Get the buffer from the appsink
Buffer buffer = appSink.pullSample().getBuffer();
// Check if video metadata is present on the buffer
if (!buffer.hasMeta(GstVideoMeta.API)) {
System.out.println("No video metadata found");
return;
}
// Print out the metadata on the buffer
System.out.println("Metadata on buffer:");
for (Meta meta : buffer.getMetas(GstVideoMeta.API)) {
GstVideoMeta videoMeta = (GstVideoMeta) meta;
System.out.println(" " + videoMeta.getTag() + ": " + videoMeta.getFlags());
}
}
});
// Start pipeline
pipeline.play();
// Wait until error or EOS
pipeline.getBus().connect((Gst.MESSAGE_EOS) gstMessage -> data.loop.quit());
pipeline.getBus().connect((Gst.MESSAGE_ERROR) (gstMessage) -> {
Gst.quit();
return CustomData.VIDEO_ERROR;
});
// Run event loop
data.loop.run();
// Free resources
pipeline.stop();
pipeline.getState();
pipeline.dispose();
Gst.deinit();
}
}
This version of the code uses hasMeta() to check if the buffer contains a GstVideoMeta metadata structure, and getMetas() and getTag() to print out all available video metadata keys and their values. Note that we cast the Meta objects returned by getMetas() to GstVideoMeta so that we can access their specific methods.
Analysis of the ChatGPT Response #10:
It should be obvious by now that I am starting to get frustrated with this AI. In fact, if this AI had a physical body, I would have drop-kicked it at this point! In my prompt I say, "No, this code above will not compile," so basically I am telling ChatGPT that it is full of crap, and I try one more approach, which is to tell it exactly what the method signature should be. Unfortunately, it still makes a nonsensical hash of it. It has also reverted to checking for one particular type of metadata instead of giving me a list of all of them, like our beautiful, yet fictitious, listMetadata() method promised. So, I try again to re-orient it to generate an equivalent of the listMetadata() method using methods that actually exist.
User Prompt #11: Without using the listMetadata() method can you get the available Metadata.API structures available for the getMeta() method?
ChatGPT Response #11: Yes, you can get the available Meta.API structures for the getMeta() method without using listMetadata() by iterating over the MetaInfo array returned by MetaInfo.getInfos() method. Here's an example of how to do it:
import org.freedesktop.gstreamer.Buffer;
import org.freedesktop.gstreamer.ElementFactory;
import org.freedesktop.gstreamer.Pipeline;
import org.freedesktop.gstreamer.elements.AppSink;
import org.freedesktop.gstreamer.samples.CustomData;
import org.freedesktop.gstreamer.meta.Meta;
import org.freedesktop.gstreamer.meta.MetaInfo;

public class MetadataExample {
    public static void main(String[] args) {
        // Initialize GStreamer
        CustomData data = new CustomData();
        args = Gst.init("MetadataExample", args);
        // Create pipeline
        Pipeline pipeline = new Pipeline("rtsp-pipeline");
        pipeline.add(ElementFactory.make("rtspsrc", "source"));
        pipeline.add(ElementFactory.make("decodebin", "decoder"));
        pipeline.add(ElementFactory.make("nvstreammetadata", "metadata"));
        pipeline.add(new AppSink("appsink"));
        // Configure appsink
        AppSink appSink = (AppSink) pipeline.getElementByName("appsink");
        appSink.set("emit-signals", true);
        appSink.set("sync", false);
        appSink.connect(new AppSink.NEW_SAMPLE() {
            @Override
            public void newSample(AppSink appSink) {
                // Get the buffer from the appsink
                Buffer buffer = appSink.pullSample().getBuffer();
                // Print out the available metadata API structures
                MetaInfo[] metaInfos = MetaInfo.getInfos();
                System.out.println("Available metadata API structures:");
                for (MetaInfo metaInfo : metaInfos) {
                    if (buffer.hasMeta(metaInfo.getAPI())) {
                        System.out.println(" " + metaInfo.getName());
                    }
                }
            }
        });
        // Start pipeline
        pipeline.play();
        // Wait until error or EOS
        pipeline.getBus().connect((Gst.MESSAGE_EOS) gstMessage -> data.loop.quit());
        pipeline.getBus().connect((Gst.MESSAGE_ERROR) (gstMessage) -> {
            Gst.quit();
            return CustomData.VIDEO_ERROR;
        });
        // Run event loop
        data.loop.run();
        // Free resources
        pipeline.stop();
        pipeline.getState();
        pipeline.dispose();
        Gst.deinit();
    }
}
This version of the code uses MetaInfo.getInfos() to get an array of all available MetaInfo objects, and then iterates over them to check if the buffer contains metadata for each API. If metadata for an API is found, it prints out the name of the corresponding MetaInfo object. Note that MetaInfo.getName() is used to get the name of the MetaInfo object.
Analysis of the ChatGPT Response #11:
This is also not correct: it has invented a new class, MetaInfo, which also DOES NOT EXIST! I foolishly pressed on by telling it what it should use. In my mind, I am wondering if I can just get the thing to give me any code that will compile (beyond toy examples).
User Prompt #12: That is also not correct. However, there is an iterateMeta() method. Please use that.
ChatGPT Response #12: You are correct, my apologies for the mistake. Buffer does not have a listMetadata() method, but it does have an iterateMeta() method that can be used to iterate over all the metadata objects in the buffer.
Here is an example of how to use iterateMeta() to print out the available Meta.API structures:
import org.freedesktop.gstreamer.Buffer;
import org.freedesktop.gstreamer.ElementFactory;
import org.freedesktop.gstreamer.Pipeline;
import org.freedesktop.gstreamer.elements.AppSink;
import org.freedesktop.gstreamer.samples.CustomData;
import org.freedesktop.gstreamer.meta.Meta;
import org.freedesktop.gstreamer.meta.MetaAPI;
import org.freedesktop.gstreamer.meta.MetaInfo;

public class MetadataExample {
    public static void main(String[] args) {
        // Initialize GStreamer
        CustomData data = new CustomData();
        args = Gst.init("MetadataExample", args);
        // Create pipeline
        Pipeline pipeline = new Pipeline("rtsp-pipeline");
        pipeline.add(ElementFactory.make("rtspsrc", "source"));
        pipeline.add(ElementFactory.make("decodebin", "decoder"));
        pipeline.add(ElementFactory.make("nvstreammetadata", "metadata"));
        pipeline.add(new AppSink("appsink"));
        // Configure appsink
        AppSink appSink = (AppSink) pipeline.getElementByName("appsink");
        appSink.set("emit-signals", true);
        appSink.set("sync", false);
        appSink.connect(new AppSink.NEW_SAMPLE() {
            @Override
            public void newSample(AppSink appSink) {
                // Get the buffer from the appsink
                Buffer buffer = appSink.pullSample().getBuffer();
                // Print out the available metadata API structures
                System.out.println("Available metadata API structures:");
                buffer.iterateMeta().forEachRemaining(meta -> {
                    MetaAPI metaAPI = meta.getAPI();
                    MetaInfo metaInfo = MetaInfo.get(metaAPI);
                    System.out.println(" " + metaInfo.getName());
                });
            }
        });
        // Start pipeline
        pipeline.play();
        // Wait until error or EOS
        pipeline.getBus().connect((Gst.MESSAGE_EOS) gstMessage -> data.loop.quit());
        pipeline.getBus().connect((Gst.MESSAGE_ERROR) (gstMessage) -> {
            Gst.quit();
            return CustomData.VIDEO_ERROR;
        });
        // Run event loop
        data.loop.run();
        // Free resources
        pipeline.stop();
        pipeline.getState();
        pipeline.dispose();
        Gst.deinit();
    }
}
This version of the code uses Buffer.iterateMeta() to iterate over all the metadata objects in the buffer, and then uses Meta.getAPI() to get the corresponding Meta.API structure for each metadata object. The MetaInfo.get(Meta.API) method is used to get the MetaInfo object for the API, which is then used to get the name of the API.
Analysis of the ChatGPT Response #12:
So, again we have more made-up classes (MetaAPI and MetaInfo). Ok, one final attempt to see if I can set this thing straight.
User Prompt #13: I don't know what version you are using because in the latest version on GitHub there is no MetaAPI or MetaInfo class.
ChatGPT Response #13: I apologize for the confusion. You are correct that the MetaAPI and MetaInfo classes are not present in the latest version of gstreamer-java on GitHub.
To get the available metadata structures in a Buffer using the getMeta() method, you can use the following code:
import org.freedesktop.gstreamer.Buffer;
import org.freedesktop.gstreamer.ElementFactory;
import org.freedesktop.gstreamer.Pipeline;
import org.freedesktop.gstreamer.elements.AppSink;
import org.freedesktop.gstreamer.samples.CustomData;
import org.freedesktop.gstreamer.meta.Meta;

public class MetadataExample {
    public static void main(String[] args) {
        // Initialize GStreamer
        CustomData data = new CustomData();
        args = Gst.init("MetadataExample", args);
        // Create pipeline
        Pipeline pipeline = new Pipeline("rtsp-pipeline");
        pipeline.add(ElementFactory.make("rtspsrc", "source"));
        pipeline.add(ElementFactory.make("decodebin", "decoder"));
        pipeline.add(ElementFactory.make("nvstreammetadata", "metadata"));
        pipeline.add(new AppSink("appsink"));
        // Configure appsink
        AppSink appSink = (AppSink) pipeline.getElementByName("appsink");
        appSink.set("emit-signals", true);
        appSink.set("sync", false);
        appSink.connect(new AppSink.NEW_SAMPLE() {
            @Override
            public void newSample(AppSink appSink) {
                // Get the buffer from the appsink
                Buffer buffer = appSink.pullSample().getBuffer();
                // Print out the available metadata structures
                System.out.println("Available metadata structures:");
                for (Meta meta : buffer) {
                    Meta.MetaAPI metaAPI = meta.getAPI();
                    System.out.println(" " + metaAPI.getName());
                }
            }
        });
        // Start pipeline
        pipeline.play();
        // Wait until error or EOS
        pipeline.getBus().connect((Gst.MESSAGE_EOS) gstMessage -> data.loop.quit());
        pipeline.getBus().connect((Gst.MESSAGE_ERROR) (gstMessage) -> {
            Gst.quit();
            return CustomData.VIDEO_ERROR;
        });
        // Run event loop
        data.loop.run();
        // Free resources
        pipeline.stop();
        pipeline.getState();
        pipeline.dispose();
        Gst.deinit();
    }
}
This version of the code uses the enhanced for loop to iterate over all the metadata objects in the buffer, and then uses Meta.getAPI().getName() to get the name of the metadata structure for each metadata object.
Analysis of the ChatGPT Response #13:
Ok, we are back to a laughable response (if we were not already crying), because ChatGPT now reasons that since a top-level MetaAPI class does not exist, maybe it is an embedded (or inner) class inside an existing class (the Meta class does exist). So it hallucinated (made up) an inner class, Meta.MetaAPI, that also DOES NOT EXIST. Of course, I am sure you can see the sad pattern here. Are you ready to have ChatGPT be your pair-programming partner?
Overall Analysis of ChatGPT's performance on this generated code and its ramifications:
To say that I was severely disappointed in ChatGPT after this incident is an understatement. I had been a total cheerleader since I began testing it out about a month ago. Unfortunately, I am no longer naive about the horrible ramifications of a word-probability generator producing source code. Bluntly stated, in its present state, one should not trust any code generated by ChatGPT. But there is a more important lesson here, because a "buyer beware" caution is not enough in this case. Specifically, programming languages are not like natural languages. Programming languages are deterministic and follow explicit rules. Guessing at the next "probable word" in a sequence of programming language statements is not acceptable for one important reason: the result is executable code that can have real-world consequences. In other words, people can and will just feed ChatGPT code to a computer and try to run it. Laypeople, without programming experience, can and will take ChatGPT code and put it into running systems in corporations across the world. This is not natural language that is interpreted by a human; it is computer source code that can and will be unwittingly run by a computer. If it has not happened yet, it will. And there will be damages because of it, and there will be liability because of it. Why? Because it is totally preventable.
How is it totally preventable? The companies producing generative AI products for source code can add a validation and verification stage: one that validates the correctness of the program (including an existence proof for every class, method, and library it references), builds the program, and possibly even executes the code (in some limited cases).
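To make the "existence proof" idea concrete, here is a minimal sketch of my own (it is an illustration of the concept, not part of any existing product) that uses standard Java reflection to check whether a class and method named in generated code actually resolve on the classpath. The class and method names passed in are just examples:

```java
import java.lang.reflect.Method;

public class ExistenceCheck {

    // Existence proof for a single class/method pair named in generated code:
    // returns true only if the class loads and declares a public method
    // with the given name somewhere in its hierarchy.
    static boolean methodExists(String className, String methodName) {
        try {
            Class<?> cls = Class.forName(className);
            for (Method m : cls.getMethods()) {
                if (m.getName().equals(methodName)) {
                    return true;
                }
            }
            return false;
        } catch (ClassNotFoundException e) {
            // The class itself was hallucinated
            return false;
        }
    }

    public static void main(String[] args) {
        // A real class/method pair resolves...
        System.out.println(methodExists("java.lang.String", "substring")); // true
        // ...while a hallucinated method on a real class does not.
        System.out.println(methodExists("java.lang.String", "listMetadata")); // false
    }
}
```

Had a check like this been applied to the fictitious listMetadata(), MetaInfo, and Meta.MetaAPI names above, every one of ChatGPT's responses in this conversation would have been flagged before it ever reached me.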
Let me be very clear about this (and lawyers take note): It is non-negotiable that generative AI programs for source code include a validation stage with, at a minimum, an existence proof for all suggested classes, methods and libraries! OpenAI, Microsoft, Google, Baidu and all other companies hopping on this bandwagon – please take note.
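The "build the program" step is also cheap to automate. The following sketch, again my own illustration, uses only the javax.tools compiler API that ships with every JDK (it requires a full JDK, not a bare JRE) to compile a generated source string in memory and report whether it is valid Java against the current classpath:

```java
import javax.tools.JavaCompiler;
import javax.tools.SimpleJavaFileObject;
import javax.tools.ToolProvider;
import java.net.URI;
import java.util.List;

public class CompileCheck {

    // Wraps a source string so the system compiler can consume it
    // without touching the file system.
    static class StringSource extends SimpleJavaFileObject {
        final String code;

        StringSource(String className, String code) {
            super(URI.create("string:///" + className.replace('.', '/') + ".java"),
                  Kind.SOURCE);
            this.code = code;
        }

        @Override
        public CharSequence getCharContent(boolean ignoreEncodingErrors) {
            return code;
        }
    }

    // Returns true if the generated source compiles cleanly.
    // Diagnostics are discarded here; a real validation stage would
    // feed them back to the code generator instead.
    static boolean compiles(String className, String source) {
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        JavaCompiler.CompilationTask task = compiler.getTask(
                null, null, diagnostic -> { },
                List.of("-d", System.getProperty("java.io.tmpdir")), // keep .class files out of the cwd
                null,
                List.of(new StringSource(className, source)));
        return task.call();
    }

    public static void main(String[] args) {
        // Valid code passes...
        System.out.println(compiles("Ok", "class Ok { int x = 1; }"));
        // ...code calling a hallucinated method fails, exactly the signal we want.
        System.out.println(compiles("Bad",
                "class Bad { void f(String s) { s.listMetadata(); } }"));
    }
}
```

A build check like this subsumes the existence proof for anything on the validator's classpath, and it is precisely the discipline step that would have stopped every broken example in this article at the source.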