Creating Dynamic Types Generator Action

When using vRO Dynamic Types with the Dynamic Types Generator, you may occasionally need to create an action to manually extract the required data.

It might not be immediately obvious what format that action needs to take. By tracing through the workflows, I found that the best action to use as an example is:

com.vmware.coe.http-rest/executeRequestJson

This action comes as part of the Dynamic Types Generator package.

Key things to note are the types of data passed in and exactly what the return data is. It is the return data from the REST call in a specific format:

var properties = new Properties();
properties.put("statusCode", response.statusCode);
properties.put("headers", response.getAllHeaders());
properties.put("contentLength", response.contentLength);
properties.put("requestFullUrl", request.fullUrl);
properties.put("contentAsString", contentAsString);

The action can contain whatever filtering you like, but you must honour the above format. In my scenario, I just added a line to intercept contentAsString and filter it as required.
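As a sketch of that kind of filtering (plain JavaScript; the "results" array and "display_name" field are hypothetical and depend on the API being called), the intercept might look like:

```javascript
// Sketch only: filter the REST response body before it is handed back
// as "contentAsString". Substitute whatever structure your API returns.
function filterContent(contentAsString, namePrefix) {
    var body = JSON.parse(contentAsString);
    body.results = body.results.filter(function (item) {
        // Keep only items whose name starts with the given prefix.
        return item.display_name.indexOf(namePrefix) === 0;
    });
    return JSON.stringify(body);
}
```

The filtered string then goes into the properties object as contentAsString, so the return format shown above is preserved.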

Another word of caution: actions generated like this ARE NOT captured when performing a Dynamic Types export. You need to manually add them to the package before continuing.

NSX-T Rest PowerShell Module

I needed to develop a solution for working with NSX-T that didn't make use of the binary files included with the default NSX-T PowerShell modules.

PowerShell is a fairly effective way to deal with REST APIs, but it can use some help around the construction of the header and authentication details.
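For illustration of the boilerplate involved (sketched in JavaScript rather than PowerShell, and not taken from the module's source; NSX-T accepts HTTP Basic authentication), the header construction amounts to something like:

```javascript
// Build the headers an NSX-T REST call needs. Sketch only -- a module
// like the one described here wraps this up so callers don't have to.
function buildNsxtHeaders(username, password) {
    // Basic auth: base64-encode "user:password" (Buffer covers Node.js).
    var token = Buffer.from(username + ":" + password).toString("base64");
    return {
        "Authorization": "Basic " + token,
        "Content-Type": "application/json",
        "Accept": "application/json"
    };
}
```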

To get myself started, I put together a short PowerShell module that contains:

  • Connect-NsxtRestServer
  • Disconnect-NsxtRestServer
  • Invoke-NsxtRestMethod

https://github.com/greenscript-net/NsxtRest

It can also be downloaded from the PowerShell Gallery.

Depending on the tasks I have on the ground, I will likely add functions over time to simplify working with NSX-T.

VRSLCM Multi Deploy

Multi deploy is now available in LCM and is usually pretty reliable… but it isn't immediately obvious how to start it.

By default, when you select two or more items from the content page, the Deploy button is greyed out.

You need to make use of the filters to multi-select; you will then be able to do a multi deploy.

VRO Literal Map Viewer

I did not write the following snippet of code (this will not become a theme…), but it has become so useful to me that I wanted my own beautified JSDoc reference to it.

https://communities.vmware.com/thread/609702

Thanks, qc4vmware.

/**
* converts literal map to a JSON string
* @param {string} literalMap - literal map in text
* @id 6ecaebc1-c70b-46b2-a2e1-2ce57ef03802
* @version 0.0.1
* @allowedoperations 
* @return {string}
*/
function literalMapViewer(literalMap) {
	var jsonObj = convertLiteralMapToJson(literalMap);
	var json = JSON.stringify(jsonObj);
	System.debug(json);
	
	return json;
	
	function convertLiteralMapToJson(literalMap) {
	    var mapObj = {};
	    for each(var key in literalMap.keySet()) {
	        var obj = literalMap.get(key);
	        var className = System.getObjectClassName(obj);
	
	        // Simple literal types all expose getValue(); test class membership explicitly.
	        var simpleLiteralClasses = ["vCACCAFEStringLiteral", "vCACCAFEDateTimeLiteral", "vCACCAFEIntegerLiteral",
	            "vCACCAFEBooleanLiteral", "vCACCAFEDecimalLiteral", "vCACCAFESecureStringLiteral"];
	        if (simpleLiteralClasses.indexOf(className) !== -1) mapObj[key] = obj.getValue();
	
	        if (className == "vCACCAFEComplexLiteral") mapObj[key] = convertComplexLiteralToJson(obj);
	
	        if (className == "vCACCAFEMultipleLiteral") {
	            var multClassName = System.getObjectClassName(obj.getValue());
	            mapObj[key] = [];
	            for each(var item in obj.getValue()) {
	                var itemClassName = System.getObjectClassName(item);
	                if (itemClassName == "vCACCAFEComplexLiteral") {
	                    mapObj[key].push(convertComplexLiteralToJson(item));
	                }
	                else {
	                    System.debug("unhandled itemClassName::" + itemClassName);
	                }
	            }
	        }
	
	        if (className == "vCACCAFEEntityReference") {
	            mapObj[key] = {};
	            mapObj[key].referenceType = "vCACCAFEEntityReference";
	            mapObj[key].classId = obj.getClassId();
	            mapObj[key].componentId = obj.getComponentId();
	            mapObj[key].id = obj.getId();
	            mapObj[key].label = obj.getLabel();
	            mapObj[key].typeId = obj.getTypeId();
	        }
	
	        if (mapObj[key] == undefined) {
	            System.log(key + " unhandled className::" + className);
	            mapObj[key] = "QCconvertLiteralMapToJson: This mapping has an unhandled class for key:" + key + ", class: " + className;
	        }
	    };
	    return mapObj;
	};
	
	function convertComplexLiteralToJson(complexLiteral) {
	    var complexClass = System.getObjectClassName(complexLiteral.getValue());
	    if (complexClass == "vCACCAFELiteralMap") {
	        return convertLiteralMapToJson(complexLiteral.getValue());
	    }
	    else {
	        System.log("not sure what to do with complexClass::" + complexClass);
	        return "QCconvertComplexLiteralToJson: This complex literal has an unhandled class:" + complexClass;
	    }
	}
};

NSX-T with Postman

After following the fantastic blog entry to try to get the NSX-T Postman collection running in my Postman, I hit the Postman import failure that seems to occur after version 7.2.0.

Mine was at 7.3.6, but the issue was still present.

From the following blog:

https://community.getpostman.com/t/cannot-import-swagger-2-0-file-anymore-it-was-working-before-updating-to-7-2-2/6545

I was able to follow the steps to:

  • Downgrade my Postman
  • Import the Collection
  • Export the Collection
  • Upgrade my Postman
  • Import the Collection
  • Happy Days!

Temporarily Lowering Certificate Security to Connect vRA/vRO

PowerVRA and PowerVRO do a fantastic job of assisting with connecting to vRO and vRA. When working with self-signed certificates, using the switch "-IgnoreCertRequirements" will usually allow the connection to proceed. Occasionally a more drastic approach is required.

I often use the following bit of code to enable the connection. I apologise for not being able to state the source of this; I have had it saved in my library for quite a while and usually use it as is.

if (-not ([System.Management.Automation.PSTypeName]'ServerCertificateValidationCallback').Type)
{
$certCallback = @"
    using System;
    using System.Net;
    using System.Net.Security;
    using System.Security.Cryptography.X509Certificates;
    public class ServerCertificateValidationCallback
    {
        public static void Ignore()
        {
            if(ServicePointManager.ServerCertificateValidationCallback ==null)
            {
                ServicePointManager.ServerCertificateValidationCallback += 
                    delegate
                    (
                        Object obj, 
                        X509Certificate certificate, 
                        X509Chain chain, 
                        SslPolicyErrors errors
                    )
                    {
                        return true;
                    };
            }
        }
    }
"@
    Add-Type $certCallback
 }
[ServerCertificateValidationCallback]::Ignore()

$SecurityProtocols = @(
    [System.Net.SecurityProtocolType]::Ssl3,
    [System.Net.SecurityProtocolType]::Tls,
    [System.Net.SecurityProtocolType]::Tls12
)
[System.Net.ServicePointManager]::SecurityProtocol = $SecurityProtocols -join ","