CyberSource Checkout API doc: developer.cybersource.com/content/cybsdeveloper2021/amer/en/library/documentation/dev_guides/Secure_Acceptance_Checkout_API/Secure_Acceptance_Checkout_API.pdf
I didn't know anything about the CyberSource payment gateway. I watched this playlist and found it very helpful. KnowledgeStream not only creates the best playlists but also helped me in the comment section as well :) really appreciated ❤ Now I am a subscriber of KnowledgeStream and recommend watching his videos as well :)
Hey, I have one question. When creating the live account, after completing the form and submitting (step 3), do I wait for them to contact me before moving on to step 4? Or do I just go ahead and do the last step?
Thanks for your time on the channel. I have not created a repo for the code base. I would suggest following all the steps mentioned in the video. If you get stuck anywhere, just comment and I will help you. For a start, you can create an account on CyberSource; I have already created a video on this. Please like, share, and subscribe 😊.
Thanks a lot for your feedback. I have checked the video and don't see the issue with the voice. Sometimes, because of network bandwidth, we may face this kind of issue.
No specific video has been created for this so far. There are multiple videos on the channel about other systems, like the S3 output, which might help you. You can also refer to this link for the CloudWatch input plugin: www.elastic.co/guide/en/logstash/current/plugins-inputs-cloudwatch.html
Hi. We want to integrate with a device that has no browser or internet connection. Can I use a proxy like below? In this case the back-end servers would think that the proxy is the browser. Can we do this? It would look like: Our Client --> proxy --> Back End Servers
Hello bro, thanks for this amazing video. I have one question and need help: I want to pull my WAF logs from AWS using Logstash with the kinesis input into my local Elasticsearch cluster. Have you ever tried that? If not, can you make a video about it or help me with this issue? I tried the kinesis input to pull WAF logs from a Kinesis Firehose data delivery stream, but I got errors. Can you help me with that, bro? You seem to be the only YouTuber making these videos!
Hi, thanks a lot for the feedback. So you want to get the data from Kinesis Firehose to Elasticsearch via Logstash, that's the use case, right? Please share the Logstash config files/screenshots if possible.
Hey makinglearningezy, thanks bro for your amazing playlist about Logstash and ELK. We also want to use Logstash to pull logs from Kinesis Firehose. Right now we use Elastic Agent to pull WAF logs from AWS, but it is slow, so we want to send WAF logs to a Kinesis Firehose data delivery stream and then have Logstash pull them using the kinesis input. Can you make a video about it? Thanks.
Hello bro, thanks for the reply. Here is my Logstash conf file for the kinesis input:

input {
  kinesis {
    access_key_id => "*************"
    secret_access_key => "**********"
    kinesis_stream_name => "aws-waf-logs-test-123"
    region => "*****"
  }
}
output {
  stdout { codec => rubydebug }
}

When I try to run it, this is the error I'm getting:

[WARN ] 2023-11-19 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ] 2023-11-19 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[INFO ] 2023-11-19 [Converge PipelineAction::Create<main>] Reflections - Reflections took 858 ms to scan 1 urls, producing 132 keys and 464 values
[ERROR] 2023-11-19 [Converge PipelineAction::Create<main>] kinesis - Unknown setting 'access_key_id' for kinesis
[ERROR] 2023-11-19 [Converge PipelineAction::Create<main>] kinesis - Unknown setting 'secret_access_key' for kinesis
[ERROR] 2023-11-19 [Converge PipelineAction::Create<main>] agent - Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::IllegalStateException", :message=>"Unable to configure plugins: (ConfigurationError) Something is wrong with your configuration.", :backtrace=>["org.logstash.config.ir.CompiledPipeline.<init>(CompiledPipeline.java:120)", "org.logstash.execution.AbstractPipelineExt.initialize(AbstractPipelineExt.java:186)", "org.logstash.execution.AbstractPipelineExt$INVOKER$i$initialize.call(AbstractPipelineExt$INVOKER$i$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:847)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1318)", "org.jruby.ir.instructions.InstanceSuperInstr.interpret(InstanceSuperInstr.java:139)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:367)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:128)",
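For what it's worth, the "Unknown setting" errors suggest the logstash-input-kinesis plugin simply does not accept access_key_id / secret_access_key options; it picks up AWS credentials from the default provider chain (environment variables, a credentials profile, or an instance role). A minimal sketch of a config without those two settings, assuming the credentials are exported as AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY in the environment before Logstash starts (stream name, region, and application_name below are placeholders to adjust):

input {
  kinesis {
    # Credentials are NOT set here; they come from the AWS default
    # provider chain (env vars, ~/.aws/credentials, or IAM role).
    kinesis_stream_name => "aws-waf-logs-test-123"
    region => "us-east-1"               # replace with your stream's region
    application_name => "logstash-waf"  # checkpoint table name, any unique string
    codec => json                       # WAF log records are JSON
  }
}
output {
  stdout { codec => rubydebug }
}

This is a sketch, not a tested config; double-check the plugin's option list in the official docs for your Logstash version.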
Thanks. I think masking is not available. There is also not much detail in the official docs. Just to mention, the password would not be printed on the console.
Hello Ramakant. Your videos related to Logstash are awesome. On the internet I can't find any other video that explains it in a nutshell like yours. Tremendous effort. Thanks a lot.
I have created a similar conf file, but no parsing is happening in Logstash. There is no console log after: Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
The JMeter log is a plain-text file and its data updates live. Can we get that log into Kibana with real-time monitoring? I'm confused about how to integrate JMeter logs with ELK. Could you do a video on that within two days? I need to submit my project by Friday, so I desperately need it. Please help!
Have you tried giving the JMeter log file path in Filebeat? Filebeat should be on the same machine where your log file is present. If possible, please share the content of your Filebeat config file. In this video we have explained everything about reading a log file: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-keKvsJUSTk8.html
Nice explanation, but is there any video available to understand how merchant/aggregator integration is done in a payment gateway? Please share; it would also be helpful.
You can store the JMeter log on the machine where Filebeat is running; in other words, run Filebeat on the machine where JMeter is present. Filebeat will read the JMeter log and send it to Logstash, Logstash will send it to Elasticsearch, and Kibana will visualize it.
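That Filebeat-to-Logstash leg can be sketched as a small filebeat.yml. This is only an illustration, assuming the JMeter log is at /path/to/jmeter.log and Logstash exposes a beats input on port 5044 (both are placeholders, not from the video):

# filebeat.yml (sketch): tail the JMeter log and ship it to Logstash
filebeat.inputs:
  - type: log                  # tail a plain-text log file
    enabled: true
    paths:
      - /path/to/jmeter.log    # hypothetical path to your JMeter log
output.logstash:
  hosts: ["localhost:5044"]    # Logstash beats input, assumed on 5044

On the Logstash side you would pair this with a beats input on the same port and an elasticsearch output; once the index exists, it can be added to Kibana for real-time viewing.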