
Fluentd error: Unable to push logs to [elasticsearch]

After application deployments, Kibana stopped showing logs exactly seven days later. The Fluentd logs contained the error "Unable to push logs to [elasticsearch]". The initial response was to increase the Fluentd buffer limits:

  chunk_limit_size 10M
  queue_limit_length 256

The behavior recurred after two weeks with the same error. On closer investigation, the error was always preceded by the message "Failed to write to the buffer." This led me to inspect the Fluentd configuration again, where I found the following setting in the buffer section. Per the official Fluentd documentation, this setting blocks input processing once the buffer is full, which is why logs stopped flowing:

  overflow_action block

The fix is to change overflow_action from block to drop_oldest_chunk, which keeps logs flowing to Elasticsearch by discarding the oldest chunk in the buffer whenever it fills up:

  <buffer>
    @type file
    path /var/log/fluentd-buffers/kubernet...
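
For reference, a minimal sketch of the corrected output section. The match pattern, host, port, and buffer path are illustrative placeholders (the original path is truncated above), and it assumes the fluent-plugin-elasticsearch output with the buffer limits mentioned earlier:

  <match kubernetes.**>
    @type elasticsearch
    host elasticsearch-logging
    port 9200
    <buffer>
      @type file
      path /var/log/fluentd-buffers/kubernetes.buffer
      chunk_limit_size 10M
      queue_limit_length 256
      # drop the oldest chunk instead of blocking input when the buffer is full
      overflow_action drop_oldest_chunk
    </buffer>
  </match>

The trade-off with drop_oldest_chunk is data loss under sustained back-pressure: the oldest buffered logs are dropped rather than stalling the input, so persistent Elasticsearch slowness still needs to be addressed separately.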

Publishing SCOM certificate for workgroup computer


1. Create the certificate request as follows:

 a. Create request.inf with the following configuration:

  [NewRequest]
  ; Use the FQDN of the workgroup computer as the subject
  Subject="CN=<Servername>"
  ; The private key must be exportable so the certificate can be moved as a .pfx
  Exportable=TRUE
  KeyLength=1024
  ; AT_KEYEXCHANGE
  KeySpec=1
  ; Digital signature, non-repudiation, key encipherment, data encipherment
  KeyUsage=0xf0
  ; Create the key in the local machine store rather than the user store
  MachineKeySet=TRUE
  [EnhancedKeyUsageExtension]
  ; Server Authentication
  OID=1.3.6.1.5.5.7.3.1
  ; Client Authentication
  OID=1.3.6.1.5.5.7.3.2

 b. Run the following command to create the request from the request.inf created above:

certreq -new -f request.inf BinaryRequest.req


2. Submit the request to the CA (Standalone or Enterprise) and export the certificate as pfx.
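
For a standalone CA, step 2 can also be scripted instead of using the CA console. A sketch, assuming the request file from step 1; IssuedCert.cer is a placeholder output name, and <Thumbprint> stands for the thumbprint of the issued certificate in the local machine store:

certreq -submit BinaryRequest.req IssuedCert.cer
certreq -accept IssuedCert.cer
certutil -p <Password> -exportPFX My <Thumbprint> C:\<certificate name>.pfx

Run these from an elevated prompt so the issued certificate binds to the machine key created in step 1.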

3. On the workgroup computer, run the following commands (MOMCertImport.exe is located in the SupportTools folder on the Operations Manager installation media):

MOMCertImport.exe /Remove
MOMCertImport.exe C:\<certificate name>.pfx /Password <Password>
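
To confirm the import, one option (assuming a default agent installation) is to check the certificate serial number that MOMCertImport writes to the registry, then restart the agent's health service so it picks up the certificate:

reg query "HKLM\SOFTWARE\Microsoft\Microsoft Operations Manager\3.0\Machine Settings" /v ChannelCertificateSerialNumber
net stop HealthService
net start HealthService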

